⚠️ EPILEPSY WARNING ⚠️

THIS PROJECT DISCUSSES AND DEMONSTRATES FLASHING / STROBING CONTENT. IF YOU HAVE PHOTOSENSITIVE EPILEPSY OR ARE SENSITIVE TO FLASHING LIGHTS, PLEASE PROCEED WITH CAUTION. THE DEMO VIDEOS BELOW CONTAIN EXAMPLES OF SCREEN FLASHING BEFORE AND AFTER FILTERING.

FlashFilter

Real-time screen flash detection and suppression.

Inspiration

We've all been there. You're scrolling through reels late at night, watching YouTube, or deep into a gaming session, and out of nowhere a blinding flash hits you in the face. Your eyes sting, your head starts throbbing, and your whole mental state shifts. We got tired of it and went looking for solutions, and we were genuinely shocked to find that almost nothing exists.

Apple's "dimming" feature is the closest thing out there, but it takes 2 to 3 seconds to kick in. For someone with photosensitive epilepsy, a seizure can be triggered well before that delay elapses. And when it finally activates, it simply blacks out the flashing region, leaving that part of the screen unusable until the flashing stops. Microsoft offers nothing comparable. We searched extensively and came up empty: there is no viable third-party solution to this problem.

That was the moment we decided to build FlashFilter. Over 50 million people worldwide live with epilepsy, and beyond seizures, rapid flashing can trigger migraines, sensory overload, and disorientation in a much wider population. Gamers, late-night scrollers, people with sensory conditions — none of them should have to worry about what their screen might throw at them next.

What it does

FlashFilter is a background application that monitors your entire screen in real time, detects regions that are flashing, and smooths them out instantly before they ever reach your eyes. It runs as a transparent overlay on top of everything: movies, games, browsers, any application. It works at the pixel level, so it filters only the parts of the screen that are actually flashing rather than blacking out the whole display the way Apple does.

Detection latency is 8.3 milliseconds, and the processing pipeline can run at over 1000 FPS. There is zero learning curve: you run a script, it starts, and it stays on in the background until you close your laptop. You don't have to think about it at all.

How we built it

We built FlashFilter using Electron with a WebGL-powered transparent overlay that captures and processes screen frames in real time. The screen is captured using platform-native APIs, and each frame is analyzed entirely on the GPU using WebGL shaders. The shaders track per-pixel temporal intensity patterns to identify rapid oscillations, which is how we distinguish actual flashing from normal screen movement.

The key engineering achievement is that the algorithm runs in constant time and constant memory per pixel per frame. There are no frame history buffers, no ring buffers, no growing memory allocations. Every pixel is evaluated independently using a stateless temporal filter. This means we can run at absurdly high frame rates without worrying about memory buildup or slowdowns, and we can even process a single frame multiple times before the next one becomes available.
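As an illustration of the constant-memory idea, here is a TypeScript sketch of a per-pixel temporal filter. This is not the project's actual shader (which runs in WebGL and is described above as stateless); it is an assumed exponential-moving-average formulation keeping just two floats per pixel, and all names, constants, and thresholds below are hypothetical:

```typescript
interface PixelState {
  smoothed: number; // EMA of luminance: the pixel's "stable" baseline (0..1)
  flicker: number;  // EMA of |deviation from baseline|; decays when stable
}

const SMOOTH_ALPHA = 0.2;    // how quickly the baseline tracks the signal
const FLICKER_ALPHA = 0.5;   // how quickly the flicker estimate reacts
const FLASH_THRESHOLD = 0.2; // flicker level treated as flashing (illustrative)

// One O(1) update per pixel per frame: two floats of state, no frame history.
function updatePixel(
  state: PixelState,
  luminance: number
): { state: PixelState; isFlashing: boolean; output: number } {
  const deviation = luminance - state.smoothed;
  const flicker =
    FLICKER_ALPHA * Math.abs(deviation) + (1 - FLICKER_ALPHA) * state.flicker;
  const smoothed =
    SMOOTH_ALPHA * luminance + (1 - SMOOTH_ALPHA) * state.smoothed;
  const isFlashing = flicker > FLASH_THRESHOLD;
  // Suppress a flash by emitting the smoothed baseline instead of the raw sample.
  return {
    state: { smoothed, flicker },
    isFlashing,
    output: isFlashing ? smoothed : luminance,
  };
}
```

A strobing pixel keeps its flicker estimate high and gets replaced by the smoothed baseline; a steady pixel's flicker decays to zero and passes through untouched. Note that this simple form still reacts to movement, which is exactly the failure mode discussed in the Challenges section.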

Challenges we ran into

The fundamental problem we kept hitting was that flashing and movement are nearly indistinguishable at the pixel level. Our first approach, simply comparing pixel deltas between frames, was technically a perfect flash detector, but it turned the entire screen into a blurry mess anytime anything moved. Text scrolling looked like flashing. Dragging a window looked like flashing. Even the mouse cursor triggered it. It turns out text is one of the highest-contrast patterns on your screen, and to a naive algorithm, any movement of high-contrast content looks identical to flashing.
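This failure mode is easy to reproduce. Below is a minimal TypeScript sketch of that naive delta detector (the function name and threshold are ours, for illustration only): a genuine flash and scrolling text produce identical per-pixel signatures.

```typescript
const DELTA_THRESHOLD = 0.5; // illustrative: flag large per-frame luminance jumps

// The naive detector: flag any pixel whose luminance changed sharply
// since the previous frame.
function naiveFlashMask(prev: number[], curr: number[]): boolean[] {
  return curr.map((lum, i) => Math.abs(lum - prev[i]) > DELTA_THRESHOLD);
}

// A genuine full-screen flash: every pixel jumps dark -> bright.
const flashMask = naiveFlashMask([0, 0, 0, 0], [1, 1, 1, 1]);

// Black text on a white page scrolled by one pixel: the glyph edges
// produce exactly the same dark <-> bright jumps.
const scrollMask = naiveFlashMask([1, 0, 1, 0], [0, 1, 0, 1]);

// Both masks come back all-true: per-pixel deltas alone cannot
// tell the two cases apart.
```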

We tried spatial segmentation next: clustering pixels into contiguous blocks and tracking their position, color, and size across frames to figure out whether something moved or actually flashed. But clustering costs at least O(n) just to traverse the image, which becomes prohibitive at high resolutions and frame rates. That motivated a 3-hour detour into rewriting the entire thing in Rust using ScreenCaptureKit on macOS and the Desktop Duplication API on Windows. The native version introduced hundreds of milliseconds of latency because it was copying GPU textures into CPU memory and then re-uploading them for compute, which also caused memory usage to balloon to around 200MB as frames piled up during the transfer delay. Electron turned out to be genuinely the right tool for this.

Eventually we found an approach that is robust to movement while still catching rapid flashing. The tradeoff is that we have slightly more false negatives — some flash patterns can sneak past detection — but the user experience is dramatically better because normal screen activity is never disrupted.

Accomplishments that we're proud of

The constant-time, constant-memory processing is something we're really proud of. No frame buffers, no ring buffers, no growing allocations: each pixel is processed independently in O(1). The 8.3 ms detection latency means we're faster than a single frame at 120 FPS. We battle-tested the whole thing extensively: we watched an entire movie through it, played Call of Duty through it, and used it as our daily driver for a full day with zero friction. Cross-platform support for macOS and Windows works out of the box.

The zero learning curve is also something we care about a lot. We specifically designed FlashFilter so that people who aren't technical can still use it. You run it once and it just works in the background. No settings to configure, no UI to learn, nothing to think about.

What we learned

Real-time computer vision is brutally hard, especially when your objects of interest are individual pixels and your deadline is 8 milliseconds. We learned that photosensitive epilepsy is more nuanced than we initially thought — red flashes are more dangerous than other colors, certain flash waveforms can slip past detection depending on their shape relative to the screen's refresh rate, and the frequency thresholds that trigger seizures vary from person to person.

We also learned that Electron gets a bad reputation that isn't always deserved. For GPU-bound workloads with a thin UI layer, it's genuinely excellent — our native Rust rewrite was actually slower due to GPU texture transfer overhead. And we learned that watching hours of strobing test patterns for research purposes will absolutely make your brain feel fuzzy.

What's next for FlashFilter

There is a lot of room for improvement. The Electron runtime means the app currently uses around 800MB of memory, which is way too much for a background utility. A native rewrite should be able to bring the executable size and memory footprint under 10MB while also enabling things like excluding the cursor from the capture stream so it doesn't affect the detector.

The detection algorithm needs work on two fronts: catching some of the trickier flash profiles that currently slip through, and weighting red-spectrum flashes more heavily since research shows they're the most likely to trigger seizures. We'd also love to move from our current collection of heuristics toward a system with formally provable safety properties.
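As a sketch of what red-spectrum weighting might look like: the Rec. 709 luminance coefficients below are standard, but RED_BOOST and the saturated-red test are illustrative placeholders we made up, not clinically derived values.

```typescript
type RGB = [number, number, number]; // components in 0..1

const RED_BOOST = 2.0; // illustrative weighting, not a clinical constant

// Standard Rec. 709 relative-luminance coefficients.
function relativeLuminance([r, g, b]: RGB): number {
  return 0.2126 * r + 0.7152 * g + 0.0722 * b;
}

// Crude saturated-red test: strong red channel, weak green and blue.
function isSaturatedRed([r, g, b]: RGB): boolean {
  return r > 0.8 && g < 0.2 && b < 0.2;
}

// Score a single transition: a transition involving saturated red counts
// as more severe than a neutral transition with the same luminance swing.
function flashSeverity(prev: RGB, curr: RGB): number {
  const deltaLum = Math.abs(relativeLuminance(curr) - relativeLuminance(prev));
  return isSaturatedRed(prev) || isSaturatedRed(curr)
    ? deltaLum * RED_BOOST
    : deltaLum;
}
```

For example, a black-to-pure-red transition scores higher here than a black-to-gray transition of identical luminance change, which is the direction the research points.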

Linux support is on the roadmap but presents its own challenges — getting a transparent overlay working across Wayland and X11 requires setting up an additional display on another TTY, piping input back to the original desktop environment, and dealing with PipeWire for framebuffer access. It's the only robust approach that works independent of the display server.

Longer term, we want to enable proper game integration so the filtering can hook directly into rendering pipelines for even lower latency. We're seriously considering putting in the effort to polish this up and bring it to market.
