April 20, 2026

Your Client Doesn't File Bug Reports

Why vague client complaints become actionable only when you can replay the session, inspect the viewport, and see what actually happened.

You wake up to a Slack message from your client:

"The page isn't working."

That's it. No screenshot. No URL. No browser. No description of what they were trying to do or what happened instead. Just five words and the implicit expectation that you'll figure it out.

You won't. Not from that message.


The Reporting Gap

Developers report bugs with steps to reproduce, expected vs. actual behavior, browser version, and console output. We do this because we know that without context, a bug report is just a complaint.

Clients don't think like this. They're not being difficult — they just don't have the vocabulary or the instinct. When something breaks, they describe how it felt, not what happened technically.

You get messages like:

"The colors look weird on my end."

"I clicked the button and nothing happened."

"It was working yesterday?"

"The layout is off."

Every one of these could mean ten different things. A CSS issue. A caching problem. A failed API call. A browser-specific rendering bug. A responsive breakpoint they hit on their tablet that you never tested. You have no way to know without a 20-minute back-and-forth that frustrates both of you.


Design Feedback Is the Worst Offender

Functional bugs are at least binary — something works or it doesn't. Design feedback from clients exists in a much more painful grey zone.

"It doesn't feel right."

"Can you make it more modern?"

"The spacing seems off somewhere."

You stare at your screen. It looks fine to you. It looks exactly like the mockup they approved. But they're seeing it on a 13-inch laptop at 125% scaling in Chrome with three toolbars installed, and the hero section is getting crushed in a way you've never seen.

The problem isn't that they're wrong. The problem is that you can't see what they're seeing.

With a session replay, you can. You see their actual viewport. Their actual scroll behavior. The exact element they hovered over when they decided something "felt off." You see whether they even looked at the section you spent two days refining, or whether they scrolled past it in half a second and fixated on a component you considered done.

Design feedback stops being a guessing game when you can watch the session.


The "Nothing Happened" Mystery

The single most common client bug report is some version of "I clicked it and nothing happened." It's also the least useful, because the causes range from trivial to catastrophic:

  • The button triggered a request that failed silently.
  • The button worked but the UI didn't update.
  • They clicked next to the button, not on it.
  • JavaScript errored before the click handler attached.
  • The action succeeded but they expected different feedback.
  • They double-clicked and the second request broke something.
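Several of these causes come down to a click handler that swallows failure or doesn't guard against re-entry. A minimal sketch (hypothetical names, not DemoTape's API) of a handler that surfaces errors and ignores a second click while a request is still in flight:

```javascript
// Hypothetical sketch: wrap an async submit action so that
//  (a) failures are reported instead of swallowed silently, and
//  (b) a double-click can't fire the request twice.
function makeClickHandler(submit, onError) {
  let inFlight = false;
  return async function handleClick() {
    if (inFlight) return "ignored"; // second click while the first is pending
    inFlight = true;
    try {
      return await submit(); // e.g. a fetch() to your API
    } catch (err) {
      onError(err); // show the user something; don't fail silently
      return "failed";
    } finally {
      inFlight = false;
    }
  };
}
```

Without the `inFlight` guard, the "double-clicked and the second request broke something" case is exactly what shows up in the replay; without the `catch`, you get the "failed silently" one.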

Without session replay, you're guessing. You ask them to try again. They say it works now (or it doesn't, but they describe it differently this time). You check the logs and find nothing because the error happened client-side. You deploy a logging patch and ask them to try again tomorrow.

With DemoTape, you open the replay. You see the click. You see the network request fail with a 422. You see the error in the console. You fix it in ten minutes instead of spending a day playing detective over Slack.


What Clients Actually Do (vs. What They Say They Did)

This applies to design reviews as much as bug reports. Clients describe their experience through the lens of what they expected, not what actually happened.

"I couldn't find the login." — They scrolled past it three times. It was visible. The label just didn't register because it said "Get Started" instead of "Login."

"The form is broken." — The form works. They missed a required field, the validation message appeared below the fold, and they never scrolled down to see it.

"The homepage looks empty." — They're on a 4K monitor. Your max-width container is leaving massive gutters. On your screen it looks balanced. On theirs it looks like a napkin floating in a swimming pool.

Every one of these is a real design issue worth fixing. But you'd never find it from the Slack message alone. You need to see their screen, their viewport, their flow.


The Async Advantage

The alternative to session replay is a live call. "Can you share your screen and show me what's happening?" It works, but it has problems:

Timing. The bug happened at 9pm. By the time you get on a call the next day, it might not reproduce.

Performance anxiety. Clients behave differently when they know you're watching. They navigate more carefully. They don't do the weird, unstructured clicking they do when they're alone — the clicking that reveals real usability problems.

No rewind. On a live call, you see it once. If you miss the failed request in the network tab, it's gone. Session replays let you scrub back, check the console at the exact moment of the click, and cross-reference the DOM state with the network timeline.

DemoTape records all of this passively. Your client uses the shared link like they normally would. You review it when you have time. No scheduling, no screen-share awkwardness, no "can you try that again?"


Turning Complaints Into Tickets

The real shift isn't technical — it's conversational. When a client says "the page isn't working," you don't need to interrogate them anymore. You open the replay, find the session, and see exactly what happened.

Your response goes from "can you describe what you saw?" to "found it — the checkout API is returning a 500 when the cart has more than 10 items. Fixing now."

That's a different relationship. The client feels heard. You look competent. The fix happens faster. Nobody spends 45 minutes in a Slack thread trying to establish basic facts.


Start Capturing Before They Complain

The best session to have recorded is the one where the bug happened. The worst time to set up replay is after a client messages you saying something is broken.

If you're sharing work with clients — especially design work, especially early-stage builds where things are still rough — capture sessions from the start. Not because you expect bugs, but because when they happen, you'll have the evidence instead of the guesswork.

npx @demotape.dev/cli — share your local build, capture every session, and never ask "can you send me a screenshot?" again.