fetch-network-simulator: A Chaos Monkey for Your Fetch Calls
Every developer knows the feeling. You build a feature, test it on localhost, ship it, and then watch it crumble the moment a user on a spotty coffee shop WiFi connection tries to load a dashboard. The problem is not your code. The problem is that your development environment is a lie -- instant responses, zero packet loss, and unlimited bandwidth. Real users live in a different world.
fetch-network-simulator is a lightweight library that intercepts the global fetch function and injects realistic network chaos into your development environment. It does not mock your APIs or replace your backend. Instead, it wraps your real API calls with configurable instability: artificial latency, random packet loss, retry logic, stale responses, concurrency limits, and bandwidth throttling. Think of it as a chaos monkey that lives inside your browser's network layer.
Why Your Localhost Needs Some Turbulence
Testing on a perfect connection means you never see how your loading spinners behave under a 3-second delay, whether your error boundaries catch dropped requests, or if your UI handles out-of-order responses gracefully. Chrome DevTools offers basic network throttling, but it is a blunt instrument that affects every request uniformly and cannot simulate packet loss, stale responses, or burst control.
fetch-network-simulator gives you fine-grained, programmable control over six different failure modes, all configured from a single function call. Zero dependencies, browser-only, development-time only.
What It Can Break
The library ships with six built-in simulation rules, each targeting a different dimension of network unreliability:
- Latency -- Adds artificial delays to responses
- Packet Loss -- Randomly fails requests at a configurable probability
- Automatic Retries -- Configurable retry logic with delay between attempts
- Stale Responses -- Simulates race conditions with out-of-order responses
- Burst Control -- Limits concurrent requests to simulate congestion
- Bandwidth Throttling -- Restricts response speed to a specified kbps
Each rule hooks into a lifecycle pipeline with beforeRequest, afterResponse, and onError phases, and the architecture is extensible if you need custom behaviors.
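The library's internals are not reproduced here, but the general pattern behind this kind of tool is a wrapped fetch that runs each enabled rule's phases around the real call. A minimal conceptual sketch (not the library's actual source; the `SimulationRule` type and `wrapFetch` helper are illustrative):

```typescript
// Conceptual sketch of the interception pattern, not the library's source.
// A rule contributes hooks that run around the real fetch call.
type SimulationRule = {
  beforeRequest?: () => Promise<void>;
  afterResponse?: (res: Response) => Promise<Response>;
  onError?: (err: unknown) => Promise<Response>;
};

function wrapFetch(
  original: typeof fetch,
  rules: SimulationRule[]
): typeof fetch {
  return async (input, init) => {
    // beforeRequest phase: e.g. inject latency, drop the request, queue it
    for (const rule of rules) await rule.beforeRequest?.();
    try {
      let res = await original(input, init);
      // afterResponse phase: e.g. throttle the body, reorder responses
      for (const rule of rules) {
        if (rule.afterResponse) res = await rule.afterResponse(res);
      }
      return res;
    } catch (err) {
      // onError phase: e.g. retry, then rethrow if no rule handles it
      for (const rule of rules) {
        if (rule.onError) return rule.onError(err);
      }
      throw err;
    }
  };
}

// Example rule: inject 100 ms of latency before every request.
const latencyRule: SimulationRule = {
  beforeRequest: () => new Promise((resolve) => setTimeout(resolve, 100)),
};
```

The key property is that the original fetch still runs, so real responses flow through; the rules only perturb timing and delivery around it.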
Installation
npm install fetch-network-simulator
yarn add fetch-network-simulator
Flipping the Switch
Wiring It Up at the Entry Point
The simulator must be initialized at your application's entry point, not inside a component. It patches the global fetch function once and applies rules to every subsequent request.
// main.ts
import { enableNetworkSimulator } from "fetch-network-simulator";

if (import.meta.env.DEV) {
  enableNetworkSimulator({
    debug: true,
    latency: { enabled: true, delayMs: 1500 },
    packetLoss: { enabled: true, lossRate: 0.2 },
  });
}
The import.meta.env.DEV guard ensures the simulator never touches production builds. When debug is set to true, the console groups structured logs for every request, showing which rules fired, how long delays were injected, and whether retries occurred.
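Note that import.meta.env.DEV is Vite-specific. Under webpack, esbuild, or other bundlers that statically replace process.env.NODE_ENV, an equivalent guard (an assumption about your build tooling, not something the library requires) looks like:

```typescript
// Assumption: your bundler statically replaces process.env.NODE_ENV,
// so this entire branch is dead-code-eliminated from production bundles.
if (process.env.NODE_ENV !== "production") {
  const { enableNetworkSimulator } = await import("fetch-network-simulator");
  enableNetworkSimulator({
    latency: { enabled: true, delayMs: 1500 },
  });
}
```

The dynamic import keeps the library out of the production bundle even if tree-shaking misses the branch.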
Turning It Off
When you need clean network behavior again, a single call restores the original fetch:
import { disableNetworkSimulator } from "fetch-network-simulator";
disableNetworkSimulator();
Gradually Turning Up the Pain
Mild Turbulence
Start gentle. A half-second delay on every request is enough to reveal whether your loading states are properly wired:
enableNetworkSimulator({
  latency: { enabled: true, delayMs: 500 },
});
If your UI jumps straight from empty to loaded with no spinner, this is the setting that exposes it. Mobile connections routinely add this much latency to an API round trip, so a half-second delay is a baseline, not an edge case.
Moderate Instability
Now introduce some packet loss and retry logic. With a 30% failure rate and two retry attempts, you can verify that your error handling and retry mechanisms work together:
enableNetworkSimulator({
  debug: true,
  latency: { enabled: true, delayMs: 2000 },
  packetLoss: { enabled: true, lossRate: 0.3 },
  retry: { enabled: true, maxAttempts: 2, retryDelayMs: 500 },
});
Watch the console output carefully. The debug logs will show you exactly which requests failed, which ones were retried, and how long the entire chain took to resolve. This is invaluable for understanding whether your app gracefully degrades or silently breaks.
Severe Conditions
This configuration simulates a user on a congested network with minimal bandwidth. Only one request can fly at a time, half of them drop, and those that survive crawl through at 100 kbps:
enableNetworkSimulator({
  latency: { enabled: true, delayMs: 5000 },
  packetLoss: { enabled: true, lossRate: 0.5 },
  burstControl: { enabled: true, maxConcurrent: 1 },
  networkSpeed: { enabled: true, kbps: 100 },
});
If your app still functions under these conditions, congratulations. Your users on the London Underground will thank you.
Hunting Race Conditions
Catching Stale State Overwrites
One of the most insidious bugs in frontend development is the stale response problem. A user types a search query, the first request takes 3 seconds, the second takes 200 milliseconds, and suddenly the UI shows results for the first query because it arrived last.
enableNetworkSimulator({
  debug: true,
  staleResponse: { enabled: true, staleProbability: 0.5 },
  latency: { enabled: true, delayMs: 2000 },
});
With staleProbability set to 0.5, half of your responses will arrive out of order. This immediately reveals whether your state management handles response ordering correctly. If you are using React Query or SWR, they handle this for you. If you are managing fetch state manually, this setting will find the bugs.
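The usual manual fix is to tag each request with a sequence number and discard any response from a superseded request. A sketch of that guard, where `search` and `render` are hypothetical stand-ins for your real fetcher and UI update:

```typescript
// Sketch: discard out-of-order responses with a request sequence counter.
// `search` and `render` are illustrative placeholders, not library APIs.
let latestRequestId = 0;

async function searchAndRender(
  query: string,
  search: (q: string) => Promise<string[]>,
  render: (results: string[]) => void
): Promise<void> {
  const requestId = ++latestRequestId;
  const results = await search(query);
  // A slower, older request may resolve after a newer one; drop it.
  if (requestId !== latestRequestId) return;
  render(results);
}
```

An AbortController that cancels the previous request achieves the same end and also saves bandwidth; the counter is just the smallest version of the idea.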
Stress-Testing Concurrent Requests
Dashboards that fire multiple API calls on mount are particularly vulnerable to concurrency issues. The burst control rule lets you simulate a bottleneck:
enableNetworkSimulator({
  burstControl: { enabled: true, maxConcurrent: 2 },
  latency: { enabled: true, delayMs: 1000 },
});
With only two concurrent requests allowed, a dashboard that fires ten parallel fetches will queue the rest. This reveals whether your UI handles partial data gracefully or waits for everything before rendering anything.
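One way to handle partial data is to settle each panel's request independently instead of awaiting them as a single group, so a slow or failed request never blocks the others. A sketch, with `fetchers` standing in for your real panel loaders:

```typescript
// Sketch: load dashboard panels independently so one slow or failed
// request doesn't block the rest. Panel names are illustrative.
async function loadPanels(
  fetchers: Record<string, () => Promise<unknown>>
): Promise<Record<string, { ok: boolean; data?: unknown }>> {
  const names = Object.keys(fetchers);
  const settled = await Promise.allSettled(names.map((n) => fetchers[n]()));
  const out: Record<string, { ok: boolean; data?: unknown }> = {};
  settled.forEach((result, i) => {
    out[names[i]] =
      result.status === "fulfilled"
        ? { ok: true, data: result.value }
        : { ok: false };
  });
  return out;
}
```

Promise.allSettled, unlike Promise.all, never short-circuits on the first rejection, which is exactly the behavior a partially-degraded dashboard needs.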
The Full Configuration Menu
Here is every option the simulator accepts, assembled into a single configuration object for reference:
enableNetworkSimulator({
  debug: true,
  latency: { enabled: true, delayMs: 1500 },
  packetLoss: { enabled: true, lossRate: 0.3 },
  retry: { enabled: true, maxAttempts: 3, retryDelayMs: 200 },
  staleResponse: { enabled: true, staleProbability: 0.5 },
  burstControl: { enabled: true, maxConcurrent: 1 },
  networkSpeed: { enabled: true, kbps: 500 },
});
Each rule can be independently enabled or disabled, so you can focus on one failure mode at a time or combine them for a realistic worst-case scenario.
Where It Fits in Your Toolbox
It is worth understanding what fetch-network-simulator is not. It is not a replacement for MSW (Mock Service Worker) or Mirage JS, which mock API responses with fake data. Those tools answer the question "what if the API returns X?" while fetch-network-simulator answers "what if the network is unreliable?" They are complementary.
It is also not a replacement for Chrome DevTools network throttling, which operates at the browser transport level and affects all network activity uniformly. The simulator gives you programmable, per-feature control that you can check into source control and share with your team.
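Because the configuration is a plain object, one workable convention (a team pattern, not a library feature) is to keep named presets in a shared, version-controlled module so everyone exercises the same failure modes:

```typescript
// chaos-presets.ts -- shared chaos presets checked into the repo.
// Preset names and groupings are a team convention, not part of the library.
export const chaosPresets = {
  mild: {
    latency: { enabled: true, delayMs: 500 },
  },
  severe: {
    latency: { enabled: true, delayMs: 5000 },
    packetLoss: { enabled: true, lossRate: 0.5 },
    burstControl: { enabled: true, maxConcurrent: 1 },
  },
} as const;

// In main.ts: enableNetworkSimulator(chaosPresets.severe);
```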
A few things to keep in mind: the library intercepts fetch only, not XMLHttpRequest or other HTTP clients. It runs in the browser only, not Node.js. And it should only ever run in development, never in production.
Wrapping Up
Building UIs that work flawlessly on localhost and shatter in production is a rite of passage that nobody needs to repeat. fetch-network-simulator lets you experience the worst of real-world networking without leaving your development server. It is zero-dependency, trivial to add, and trivial to remove. Drop it into your entry point, crank up the chaos, and fix the bugs before your users find them.