4 min read · Shadab

Skeletons That Don't Flicker: Testing Real Latency with Mockfill Delays

Local APIs are too fast to exercise loading states. Use Mockfill's response delay to design skeletons and spinners against realistic latency without touching app code.

Mock API · Loading States · Frontend Workflow

Local APIs lie. They respond in 4 milliseconds, your skeleton flashes for one frame, and you ship a loading state that has never actually been seen by a human. Then a user on a hotel Wi-Fi opens the page, sits through 2.5 seconds of layout shift, and files a bug.

The fix is not "remember to test loading states." The fix is making loading states impossible to skip.

Why hardcoded delays are the wrong answer

Most frontend devs eventually do the same thing: drop a setTimeout into the fetch call, eyeball the spinner, then delete the timeout before committing. This works exactly once and then stops working, because:

  • It requires editing app code to test app code.
  • The delay only exists in your branch, so nobody else sees the same loading state you saw.
  • It is trivially forgotten in a commit.
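The anti-pattern looks roughly like this (an illustrative sketch: `fetchDashboard`, `sleep`, and the endpoint are made-up names, not anything from a real codebase):

```typescript
// A hardcoded delay wedged into the data layer just to make the
// spinner visible long enough to look at.
const sleep = (ms: number): Promise<void> =>
  new Promise((resolve) => setTimeout(resolve, ms));

async function fetchDashboard(): Promise<unknown> {
  await sleep(2000); // TODO: delete before committing (it will not be deleted)
  const res = await fetch("/api/dashboard");
  return res.json();
}
```

Everything below the `sleep` line is real app code, which is exactly the problem: the test scaffolding and the production path live in the same function.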


The Mockfill approach

Mockfill lets you set a response delay in milliseconds on any rule. The delay is applied at interception time, which means your app code stays untouched and the latency is real from the browser's point of view — same loading spinner, same skeleton, same layout shift.

  1. Identify the request that drives the screen (say, GET /api/dashboard).
  2. Create a Mockfill rule matching that method and URL.
  3. Set delayMs to something realistic. 1500–3000 ms is the honest range for "user on a slow connection."
  4. Return a normal 200 payload — the point is the wait, not the failure.
  5. Save it as a preset called "Slow network" so anyone on the team can flip it on.


A concrete rule

{
  "status": 200,
  "headers": { "content-type": "application/json" },
  "delayMs": 2000,
  "body": { "widgets": [{ "type": "sales", "value": 40921 }] }
}

Now reload the dashboard. The skeleton stays visible long enough for you to actually look at it. Adjust the layout until the transition from skeleton to content does not jump. Adjust the skeleton shape until it is not a lie about what arrives.

What you should be testing while the delay is on

  • Skeleton ↔ content layout parity. Does the content arrive in the same grid the skeleton drew?
  • Cumulative layout shift. Does anything jump, especially images and avatars?
  • Cancel behavior. What happens if the user clicks a different tab mid-load?
  • Multiple concurrent loads. Are spinners independent, or do they flash together?
  • Error timing. If the request times out, does the UI degrade gracefully or freeze?
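The cancel path in particular is cheap to wire up correctly. Here is a sketch using the standard AbortController API (`render` and `showErrorState` are placeholder stubs, and the endpoint is the example from above):

```typescript
// Exercising the cancel-mid-load path while the delay rule is on.
// AbortController/AbortSignal are standard browser (and Node 18+) APIs.
const render = (_data: unknown): void => { /* paint the dashboard */ };
const showErrorState = (_err: unknown): void => { /* show a retry UI */ };

// An aborted fetch rejects with a DOMException named "AbortError".
function isCancellation(err: unknown): boolean {
  return typeof err === "object" && err !== null &&
    (err as { name?: string }).name === "AbortError";
}

async function loadDashboard(signal: AbortSignal): Promise<void> {
  try {
    const res = await fetch("/api/dashboard", { signal });
    render(await res.json());
  } catch (err) {
    // User left mid-load: clear the skeleton quietly, no error toast.
    if (isCancellation(err)) return;
    showErrorState(err);
  }
}

const controller = new AbortController();
void loadDashboard(controller.signal);
// When the user clicks a different tab mid-load:
controller.abort();
```

With a 2000 ms delay in place you can actually click away mid-load and watch which branch fires; at 4 ms you never could.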


Building a latency preset library

Once you have one delay rule, the cheapest thing you can do is build a preset library:

Preset        Delay     Purpose
Fast 4G       250 ms    Realistic happy path
Slow 3G       1500 ms   Mid-tier mobile
Hotel Wi-Fi   3000 ms   Worst common case
Timeout       30000 ms  Forces the timeout branch
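Committed to the repo, that library might look something like this (a hypothetical shape: Mockfill's actual export schema may differ, but the delayMs field matches the rule format shown earlier):

```json
[
  { "name": "Fast 4G",     "delayMs": 250,   "status": 200 },
  { "name": "Slow 3G",     "delayMs": 1500,  "status": 200 },
  { "name": "Hotel Wi-Fi", "delayMs": 3000,  "status": 200 },
  { "name": "Timeout",     "delayMs": 30000, "status": 200 }
]
```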

Export the preset library as JSON, commit it to the repo, and share with QA. From then on, "test loading states" is a one-click action, not a discipline.

The limit of latency mocking

Mockfill simulates server-side delay. It does not simulate true network chaos — packet loss, DNS failures, TLS renegotiation. For deeper network conditions, layer in Chrome DevTools' network throttling on top of the Mockfill delay. The two compose: Chrome adds the network shape, Mockfill adds the deterministic response and the server-side wait.

The takeaway

Loading states are not a thing you remember to test. They are a thing you make impossible to ship without testing. A Mockfill preset library is the cheapest way to get there.
