Jobber: An Agentic Job Search Platform
The modern job search is a grinding, repetitive process — sifting through alert emails, cross-referencing requirements against your own experience, tailoring resumes, and making go/no-go decisions dozens of times a week. Jobber is an agentic AI platform that automates the mechanical parts of this workflow while preserving human judgment where it matters most. It also serves as a demonstration of end-to-end system design: structured AI agent orchestration, idempotent data pipelines, and a purpose-built review interface — all written in Go against PostgreSQL.
Jobber is composed of three independently developed components that form a complete pipeline from raw email to actionable decision.
Mail-Fetcher is the ingestion layer. It dispatches AI agents to parse job listing emails from platforms like LinkedIn, Glassdoor, Dice, and Indeed, extracting structured data — title, company, compensation, location, requirements — from the unstructured HTML of email alerts. The fetcher queries a message archive via MCP (Model Context Protocol), processes results through interchangeable AI backends (Claude CLI or Oz cloud agents), and upserts normalized records into PostgreSQL. Deduplication is handled via SHA-256 hashing, and a configurable lookback window prevents reprocessing. The result is a clean, canonical database of opportunities that arrives without any manual copying or data entry.
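The dedup step above can be sketched in a few lines of Go. This is an illustrative reconstruction, not Jobber's actual code: the `Listing` fields and the canonicalization rule (lower-casing and joining fields before hashing) are assumptions about how a SHA-256 fingerprint might be derived so that cosmetically different alert emails collapse to the same key.

```go
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
	"strings"
)

// Listing holds structured fields extracted from a job alert email.
// The field set here is illustrative, not Jobber's actual schema.
type Listing struct {
	Title, Company, Location string
}

// Fingerprint returns a stable SHA-256 key for deduplication.
// Fields are lower-cased and joined with a separator so trivial
// formatting differences between emails hash identically.
func Fingerprint(l Listing) string {
	canonical := strings.ToLower(strings.Join(
		[]string{l.Title, l.Company, l.Location}, "\x1f"))
	sum := sha256.Sum256([]byte(canonical))
	return hex.EncodeToString(sum[:])
}

func main() {
	a := Listing{"Senior Go Engineer", "Acme", "Remote"}
	b := Listing{"senior go engineer", "ACME", "remote"}
	fmt.Println(Fingerprint(a) == Fingerprint(b)) // prints "true"
}
```

An upsert keyed on this fingerprint (e.g. `INSERT ... ON CONFLICT DO UPDATE`) is what makes the pipeline idempotent: reprocessing the same email within the lookback window is a no-op rather than a duplicate row.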
The Evaluation Framework is where agentic reasoning meets structured decision-making. A multi-step prompt pipeline analyzes each listing through several lenses: it inventories required and preferred skills, maps them against the candidate’s experience, generates a tailored resume and cover letter, produces a match assessment with an explicit rating, and flags special considerations — skill gaps, location mismatches, compensation concerns, or recruiter red flags. The pipeline culminates in a synthesized recommendation: GO, HOLD, LOW BALL, or NO GO. Prompt variants (e.g., “standard” vs. “thirsty”) allow tuning the agent’s risk tolerance. Every decision is timestamped and auditable, creating a transparent record of agent reasoning.
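The final synthesis step might look something like the following sketch. The verdict names come from the pipeline above, but the `Assessment` fields, thresholds, and the specific rule ordering are hypothetical, shown only to make the shape of the decision (and how a "thirsty" variant shifts risk tolerance) concrete.

```go
package main

import "fmt"

// Verdict mirrors the pipeline's four recommendations.
type Verdict string

const (
	Go      Verdict = "GO"
	Hold    Verdict = "HOLD"
	LowBall Verdict = "LOW BALL"
	NoGo    Verdict = "NO GO"
)

// Assessment collects intermediate signals from the prompt pipeline.
// These fields and the thresholds below are illustrative.
type Assessment struct {
	MatchRating   int  // 0-100 skill match from the agent
	CompBelowBand bool // compensation concern flagged
	HardBlocker   bool // e.g. location mismatch or recruiter red flag
}

// Synthesize folds the signals into one verdict. The thirsty variant
// lowers the match threshold, modeling higher risk tolerance.
func Synthesize(a Assessment, thirsty bool) Verdict {
	threshold := 70
	if thirsty {
		threshold = 50
	}
	switch {
	case a.HardBlocker:
		return NoGo
	case a.MatchRating >= threshold && a.CompBelowBand:
		return LowBall
	case a.MatchRating >= threshold:
		return Go
	default:
		return Hold
	}
}

func main() {
	fmt.Println(Synthesize(Assessment{MatchRating: 60}, false)) // HOLD
	fmt.Println(Synthesize(Assessment{MatchRating: 60}, true))  // GO
}
```

Keeping the synthesis rule in ordinary code rather than buried in a prompt is part of what makes each verdict auditable: the same timestamped inputs always yield the same recommendation.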
Curator Console is the human review interface — a server-rendered Go web application with HTML templates and Tailwind CSS, backed by PostgreSQL. It provides a dashboard with pipeline statistics, a paginated job explorer with filtering and search, detailed single-job views, and a tag management system for organizing opportunities. Schema migrations (V1 through V5) are version-controlled and auto-applied on startup. Connection pooling is configured with health-aware settings so the console rides out transient network failures. Soft-delete archival preserves historical data without polluting active views. The console is where a human operator exercises final judgment: reviewing agent recommendations, overriding verdicts, and tracking application status.
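The core of an auto-apply-on-startup migration runner is just version bookkeeping, sketched below. The SQL bodies and the in-memory `applied` set are placeholders; in the real console the applied set would come from a schema-version table in PostgreSQL, and the V1-V5 migrations live in the repo.

```go
package main

import (
	"fmt"
	"sort"
)

// Migration pairs a schema version with its DDL.
type Migration struct {
	Version int
	SQL     string
}

// Pending returns the migrations not yet applied, in version order.
// On startup the runner executes each one inside a transaction and
// records its version before moving on.
func Pending(all []Migration, applied map[int]bool) []Migration {
	var out []Migration
	for _, m := range all {
		if !applied[m.Version] {
			out = append(out, m)
		}
	}
	sort.Slice(out, func(i, j int) bool { return out[i].Version < out[j].Version })
	return out
}

func main() {
	all := []Migration{
		{5, "ALTER TABLE jobs ADD COLUMN archived_at TIMESTAMPTZ"}, // soft-delete marker
		{1, "CREATE TABLE jobs (...)"},
		{3, "CREATE TABLE tags (...)"},
	}
	for _, m := range Pending(all, map[int]bool{1: true}) {
		fmt.Println(m.Version) // prints 3, then 5
	}
}
```

Note the hypothetical `archived_at` column: soft-delete archival typically means active views filter on `archived_at IS NULL` while the row itself is never dropped.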
The architectural choices reflect a preference for operational simplicity and composability. Go was chosen for all three components for its deployment story and runtime reliability. Each component communicates through PostgreSQL as a shared substrate rather than through service-to-service APIs, making the system easy to operate, debug, and extend independently. AI integration is treated as a pipeline stage with well-defined inputs and outputs — not as a black box — so agent behavior is observable, reproducible, and swappable between providers. Secrets are managed through HashiCorp Vault and injected via direnv, keeping credentials out of source control.
Jobber demonstrates how agentic AI can be integrated into a real workflow with engineering discipline: idempotent pipelines, structured prompt design, interchangeable model backends, schema-managed persistence, and a human-in-the-loop interface that ensures the decisions which actually matter are still made by a person with context and judgment.