Inspiration

We are in the middle of an employers' market. For many, a six-month job search has become the norm. The typical advice? "It's a numbers game," "just apply 300 more times," "you only need one offer." But our tools haven't kept up. We either doomscroll through repetitive, ad-ridden feeds on LinkedIn, juggle 50 open tabs for companies that only post on their own career pages, or subscribe to dozens of alerts just to be first to apply. Current job aggregators fail at the one thing they were supposed to do: aggregate jobs in one place.

What It Does

This is the first job aggregator with the potential to cover every job board on the Internet. By taking an agentic approach and letting users tell us which companies they care about, we achieve two things at once: we adapt quickly to website changes, and our scraping coverage grows in proportion to our userbase.

But we go beyond just listing open roles. We've built a sophisticated hiring-signal detection layer that helps you discover which companies are likely to hire before they even post a job. By analyzing financial reports, news from professional networks, and headcount trends, we surface momentum signals that put you ahead of the crowd. The result is a dashboard where opportunities can be filtered by category, industry, and signal strength — and where the full report, complete with source links, is accessible to contributors.
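To make the idea of "signal strength" concrete, here is a minimal sketch of how per-source signals could blend into one momentum score. All names, weights, and values below are hypothetical placeholders, not the product's actual model:

```python
from dataclasses import dataclass

@dataclass
class Signal:
    source: str      # e.g. "financials", "news", "headcount"
    strength: float  # normalized to [0, 1]

# Illustrative weights per source; real weights would be tuned on hiring outcomes.
WEIGHTS = {"financials": 0.5, "news": 0.2, "headcount": 0.3}

def hiring_momentum(signals: list[Signal]) -> float:
    """Weighted blend of per-source signal strengths, in [0, 1]."""
    return sum(WEIGHTS.get(s.source, 0.0) * s.strength for s in signals)

score = hiring_momentum([
    Signal("financials", 0.8),  # strong quarterly results
    Signal("headcount", 0.6),   # team growing month over month
])
```

A single score like this is what lets a dashboard sort and filter companies by signal strength alongside category and industry.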

How We Built It

It started as an MVP for my own job search. Every crawler had to be hand-coded to fit each website: tedious, and impossible to share with non-technical friends. Agents alone weren't reliable enough. But agents equipped with generalizable tools that only needed to be configured, not rewritten, turned out to be a different story. Suddenly, the same pipeline could scrape any job board we threw at it. The key insight was separating the how (generic tool logic) from the what (agent-driven, site-specific configuration).
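The split between tool logic and configuration can be sketched in a few lines. This is an illustrative toy, not the real pipeline: the config fields, regex, and sample HTML are all made up, and the actual product uses far more robust extraction than a regex:

```python
import re
from dataclasses import dataclass

# The tool logic below is generic; only this config object changes
# per job board, and producing it is the agent's entire job.
@dataclass
class BoardConfig:
    listings_url: str
    title_pattern: str  # regex capturing one job title per match

def scrape_titles(html: str, config: BoardConfig) -> list[str]:
    """Generic tool: identical code for every board, behavior set by config."""
    return re.findall(config.title_pattern, html)

# A config the agent might emit for a hypothetical board.
acme = BoardConfig(
    listings_url="https://example.com/careers",
    title_pattern=r'<h3 class="job-title">(.*?)</h3>',
)

sample = '<h3 class="job-title">Data Engineer</h3><h3 class="job-title">SRE</h3>'
titles = scrape_titles(sample, acme)
```

Because the agent only emits a small config rather than code, its output is easy to validate and cheap to regenerate when a site changes.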

On top of that, we built a multi-layered Apify scraper and actor infrastructure to power the hiring-signal pipeline. Each layer handles a distinct signal source — financial disclosures, professional network activity, headcount data — and feeds into a unified view of company hiring momentum.
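The layering can be pictured as independent fetchers merged into one per-company view. A minimal sketch under stated assumptions: the layer names and stub data are invented, and in the real system each layer would invoke an Apify actor rather than return constants:

```python
from typing import Callable

# Each layer maps a company name to a dict of findings for one signal source.
Layer = Callable[[str], dict]

def financials_layer(company: str) -> dict:
    return {"revenue_growth": 0.12}   # stub data standing in for a scraper

def headcount_layer(company: str) -> dict:
    return {"headcount_trend": "up"}  # stub data standing in for a scraper

def unified_view(company: str, layers: dict[str, Layer]) -> dict:
    """Merge every layer's output into one view of hiring momentum."""
    view = {"company": company}
    for name, layer in layers.items():
        view[name] = layer(company)   # layers run and fail independently
    return view

view = unified_view("AcmeCorp", {
    "financials": financials_layer,
    "headcount": headcount_layer,
})
```

Keeping each source behind its own layer means one broken scraper degrades a single field instead of the whole report.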

Challenges We Ran Into

Job postings are, fortunately, among the easiest things to scrape on the Internet. Companies want them to spread — they rarely add protections and often publish sitemaps for indexing. That said, some boards still end up behind heavy anti-bot measures, likely by accident rather than design. Apify proved powerful enough to handle almost any such nuisance.
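Because boards publish sitemaps for indexing, discovery can often start with nothing more than parsing one. A stdlib-only sketch; the sample XML and the `/jobs/` URL convention are illustrative assumptions, not any particular board's layout:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def job_urls(sitemap_xml: str, marker: str = "/jobs/") -> list[str]:
    """Pull every <loc> from a sitemap and keep the job-posting URLs."""
    root = ET.fromstring(sitemap_xml)
    urls = [loc.text for loc in root.iter(SITEMAP_NS + "loc")]
    return [u for u in urls if marker in u]

sample = """<?xml version="1.0"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/jobs/data-engineer</loc></url>
  <url><loc>https://example.com/about</loc></url>
</urlset>"""

urls = job_urls(sample)
```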

Accomplishments We're Proud Of

We designed an agentic pipeline that is flexible, efficient, and reliable — a combination that took many iterations to get right. The architecture we landed on — feeding the agent just enough context to progress to the next step, never more — turned out to be the key to consistent performance.

We're also proud of how we monetized access without sacrificing the product experience. Rather than paywalls or ads, we built payments on the HTTP 402 "Payment Required" status code, the basis of an elegant, emerging standard that enables micropayments from both humans and AI agents. This means other agents can hit our API, make a small contribution, and use our scraped data for their own purposes, making <job-seek> a node in a broader agentic ecosystem, not just a consumer product.
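The shape of a 402 exchange can be sketched as a pure request handler. This is a hedged illustration only: the `X-Payment` header name, the terms object, and the proof check are hypothetical stand-ins, not the protocol's actual wire format:

```python
# Example payment terms the server advertises to unpaid callers.
PRICE = {"amount": "0.001", "currency": "USDC"}

def verify_payment(proof: str) -> bool:
    # Stand-in for a real settlement/verification step.
    return proof == "valid-proof"

def handle(headers: dict) -> tuple[int, dict]:
    """Return (status, body) for a paid API endpoint."""
    proof = headers.get("X-Payment")
    if proof is None:
        # 402 Payment Required, plus the terms the client must satisfy.
        return 402, {"error": "payment required", "accepts": PRICE}
    if not verify_payment(proof):
        return 402, {"error": "invalid payment"}
    return 200, {"jobs": ["..."]}  # the paid resource

status, body = handle({})                        # no payment header
paid, _ = handle({"X-Payment": "valid-proof"})   # proof attached
```

The appeal for machine-to-machine use is that the whole negotiation lives in status codes and headers, which agents already know how to read.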

What We Learned

Building this project crystallized several lessons about agentic system design:

  • Context is a budget, not a dump. Giving the agent the full picture upfront led to hallucinated steps and wasted tokens. Narrowing the context window to only what's needed for the current decision dramatically improved both accuracy and cost.
  • Tools beat prompts. Instead of prompt-engineering the agent into producing scraping code, we gave it well-defined, parameterized tools. The agent's job shrank from "write a crawler" to "configure this tool" — a far more constrained and reliable task.
  • Fail fast, retry cheap. Agentic pipelines will fail. The trick is making each step small enough that retrying is inexpensive and diagnosing errors is straightforward.
  • Agents are users too. Supporting machine-to-machine payments via the 402 protocol taught us that designing for agentic consumers requires the same care as designing for humans — clear contracts, predictable responses, and fair pricing.
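The "fail fast, retry cheap" lesson reduces to keeping steps small enough that a blunt retry wrapper is affordable. A minimal sketch; the helper below is illustrative, not our production code:

```python
import time

def run_step(step, *args, retries: int = 2, backoff: float = 0.0):
    """Run one small pipeline step, retrying a bounded number of times."""
    last_err = None
    for attempt in range(retries + 1):
        try:
            return step(*args)
        except Exception as err:  # a real pipeline would narrow this
            last_err = err
            time.sleep(backoff * attempt)
    raise RuntimeError(f"step failed after {retries + 1} attempts") from last_err

# A step that fails once, then succeeds, to show the retry in action.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 2:
        raise ValueError("transient")
    return "ok"

result = run_step(flaky)
```

When every step is this cheap to rerun, a failure log points straight at the one small step that broke, which keeps diagnosis straightforward.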

What's Next for <job-seek>

We are launching in Switzerland — one of the toughest job markets there is. After testing the product and polishing the edges, we plan to expand across Europe and the US, followed by the rest of the world. Our goal is to build the Google of job search: a place where every open role on the Internet is one search away, and where proactive hiring signals give candidates an edge before the competition even knows a role exists. Ultimately, we want to make the future of hiring both agentic and humane — because every person deserves a faster path to finding their purpose.
