
Domain Checkm8: AI-Assisted Bulk Domain Checker
A Go + Datastar app for bulk domain availability checking with Namecheap API, SQLite caching, Google OAuth, and an MCP server for AI agents
This one came from a very ordinary annoyance: checking domains one by one is tedious.
When you're brainstorming names, you don't want to click through a registrar UI 30 times like you're stamping forms in a tiny government office. You want to paste in a batch, get the answers back, and move on.
Namecheap already supports bulk lookups, so I built a proper app around that idea, then kept going. It grew into a Go and Datastar project with caching, auth, and eventually an MCP server so AI agents could check domains too.
Tech Stack Talk
The interesting bits:
- First project using Datastar for real-time SSE streaming with a Go backend, no frontend framework needed
- Google OAuth with PKCE and opaque session tokens (not JWT)
- SQLite with WAL mode as the only database, keeping things simple and deployable
- MCP (Model Context Protocol) server so AI agents can check domains, explore cached results, and manage annotations
- Namecheap API quota management with rolling windows (50/min, 700/hr, 8000/day)
- Progressive enhancement: forms work without JavaScript
Why Datastar Over a Frontend Framework
After building VideoBrev with SolidJS, I wanted to try the opposite approach. Datastar gives you real-time updates through Server-Sent Events without dragging in a full frontend framework.
The Go backend renders HTML, streams updates over SSE, and the browser handles the rest natively. No SPA to deploy. No OpenAPI dance. No frontend build pipeline trying to become the main character. For a forms-and-tables app like this, it felt like the right level of technology.
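To make the flow concrete, here is a minimal sketch of the SSE side in Go. The event framing loosely mirrors Datastar's merge-fragments protocol, but the real app would use the Datastar Go SDK rather than hand-rolling it; the handler name and example fragment are illustrative.

```go
package main

import (
	"fmt"
	"net/http"
)

// formatFragmentEvent frames one server-rendered HTML fragment as a
// Datastar-style SSE event. Assumed framing, not the SDK's exact output.
func formatFragmentEvent(html string) string {
	return "event: datastar-merge-fragments\ndata: fragments " + html + "\n\n"
}

// sseHandler streams rendered rows to the browser as results arrive.
func sseHandler(w http.ResponseWriter, r *http.Request) {
	w.Header().Set("Content-Type", "text/event-stream")
	w.Header().Set("Cache-Control", "no-cache")
	flusher, _ := w.(http.Flusher)
	for _, html := range []string{
		`<tr id="row-example-com"><td>example.com</td><td>taken</td></tr>`,
	} {
		fmt.Fprint(w, formatFragmentEvent(html))
		if flusher != nil {
			flusher.Flush() // push each row immediately, no buffering
		}
	}
}

func main() {
	http.HandleFunc("/check", sseHandler)
	fmt.Print(formatFragmentEvent(`<div id="demo">hi</div>`))
	// http.ListenAndServe(":8080", nil) would start the server.
}
```

The browser's Datastar runtime swaps each fragment into the DOM by `id`, so the server stays the single source of truth for markup.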
Authentication: Opaque Tokens Over JWT
I went with opaque session tokens instead of JWTs: random values handed to the client, stored only as SHA-256 hashes in the database. Why?
- No JWT secret to manage or rotate
- Token revocation is just a database delete (no waiting for expiry)
- Separate access (15 min) and refresh (14 day) tokens with HttpOnly cookies
- PKCE (S256) on the OAuth flow to prevent authorisation code interception
It does mean more database lookups per request, but with SQLite in WAL mode on the same machine, those lookups are trivial. In practice, the simpler security model was worth it.
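The token scheme described above fits in a few lines of Go. This is a sketch with hypothetical function names, not the app's actual API; the point is that there is no signing key anywhere, just randomness plus a hash.

```go
package main

import (
	"crypto/rand"
	"crypto/sha256"
	"encoding/hex"
	"fmt"
)

// hashToken is run on every request: hash the presented token and look
// the digest up in the sessions table. Revocation is a row delete.
func hashToken(raw string) string {
	sum := sha256.Sum256([]byte(raw))
	return hex.EncodeToString(sum[:])
}

// newSessionToken returns the raw token sent to the client in an
// HttpOnly cookie and the hash stored server-side. The raw value is
// never persisted, so a leaked database does not leak live sessions.
func newSessionToken() (raw, hashed string, err error) {
	buf := make([]byte, 32) // 256 bits of randomness
	if _, err := rand.Read(buf); err != nil {
		return "", "", err
	}
	raw = hex.EncodeToString(buf)
	return raw, hashToken(raw), nil
}

func main() {
	raw, hashed, _ := newSessionToken()
	fmt.Println("raw token:  ", raw)
	fmt.Println("stored hash:", hashed)
}
```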
MCP Server for AI Agents
This was the most interesting bit. The app exposes a Model Context Protocol server at /mcp/server using streamable HTTP. AI agents authenticate with bearer tokens and can use tools like:
- check_domains_batch: check up to 50 domains, respects the cache
- explore_domains: list and filter cached domain results
- get_domain_annotations / update_domain_annotations: manage tags and notes on domains
The tools are quota-aware. If a batch is about to run into a Namecheap rate limit, the server returns a partial result plus the remaining quota instead of just failing hard. That matters because AI agents are clever right up until they hit an annoying edge case.
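The quota-aware behaviour can be sketched as a partition of the batch. The types and names here are assumptions about the shape of the response, not the app's actual wire format:

```go
package main

import "fmt"

// PartialResult models an MCP tool response when a batch would exceed
// the Namecheap quota: check what fits, report the rest back.
type PartialResult struct {
	Checked   []string // domains checked in this call
	Deferred  []string // domains the agent should retry later
	Remaining int      // quota left after this call
}

// checkWithQuota splits a batch against the remaining quota instead of
// failing the whole request when the limit would be exceeded.
func checkWithQuota(domains []string, quotaLeft int) PartialResult {
	if len(domains) <= quotaLeft {
		return PartialResult{Checked: domains, Remaining: quotaLeft - len(domains)}
	}
	return PartialResult{
		Checked:   domains[:quotaLeft],
		Deferred:  domains[quotaLeft:],
		Remaining: 0,
	}
}

func main() {
	res := checkWithQuota([]string{"a.com", "b.com", "c.com"}, 2)
	fmt.Printf("checked=%v deferred=%v remaining=%d\n", res.Checked, res.Deferred, res.Remaining)
}
```

Returning the deferred list explicitly gives the agent enough information to plan a retry rather than guessing why a call failed.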
SQLite as the Whole Backend
This was one of my favourite decisions. SQLite in WAL mode handles concurrent reads and writes perfectly well for a single-server app. The schema covers:
| Table | Purpose |
|---|---|
| domain_cache | Cached availability results with 7-day TTL |
| auth_users / auth_sessions | Google OAuth users and hashed session tokens |
| auth_pkce | Temporary PKCE verifiers during OAuth flow |
| mcp_api_tokens | Bearer tokens for MCP authentication |
| namecheap_api_calls | Rolling-window quota counters |
| _admin_row_tags | Tag overlay system for domain annotations |
The tag system uses an overlay pattern. The core domain_cache table stays clean. Tags and notes live in separate tables and get joined at query time, which keeps the cache simple and lets the annotation system evolve without rewriting the core model.
Namecheap API and Caching
The Namecheap API allows checking up to 50 domains per request. The service layer:
- Normalises and deduplicates the input domains
- Checks the SQLite cache for unexpired entries (7-day TTL)
- Only sends cache misses to the Namecheap API
- Stores fresh results back in the cache
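The first three steps amount to partitioning the batch into cache hits and misses. A minimal sketch, with the SQLite lookup stood in for by a map and all names hypothetical:

```go
package main

import (
	"fmt"
	"strings"
)

// partitionDomains normalises input, drops duplicates, and splits the
// batch into cache hits and Namecheap-bound misses. `cached` stands in
// for the SQLite lookup of unexpired (7-day TTL) rows.
func partitionDomains(input []string, cached map[string]bool) (hits, misses []string) {
	seen := map[string]bool{}
	for _, d := range input {
		d = strings.ToLower(strings.TrimSpace(d))
		if d == "" || seen[d] {
			continue // skip blanks and duplicates
		}
		seen[d] = true
		if cached[d] {
			hits = append(hits, d)
		} else {
			misses = append(misses, d)
		}
	}
	return hits, misses
}

func main() {
	cached := map[string]bool{"example.com": true}
	hits, misses := partitionDomains([]string{" Example.com ", "new.dev", "example.com"}, cached)
	fmt.Println(hits, misses) // [example.com] [new.dev]
}
```

Only the misses ever reach the Namecheap API, which is what makes the quota limits workable.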
Popular domains get checked once and then come from cache for a week. The quota system tracks rolling windows so the app stays within Namecheap's limits without guesswork.
Architecture
The project follows a modular monolith pattern with strict feature isolation:
core/            # Shared infrastructure (auth, config, logging, router, SQLite, MCP)
feature/         # Self-contained business domains
  home/          # Domain checker UI
  domaincheck/   # Domain availability service
  explore/       # Domain explorer with filtering and tags
  mcp/           # MCP management UI and token management
  adminsql/      # SQLite admin interface (super admin only)
  auth/          # Login/logout/callback handlers
deps/            # Dependency injection (single constructor)
ui/              # Shared templ components and layout
There are no cross-feature imports. Each feature registers its own routes via an APIProvider interface, and shared logic lives in core/. If I want to remove a feature later, it should be mostly a delete-and-unplug job, which is exactly how I like it.
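The registration pattern can be sketched like this. The interface name APIProvider comes from the text; its method and the feature type here are hypothetical:

```go
package main

import (
	"fmt"
	"net/http"
)

// APIProvider is the contract each feature satisfies: it wires its own
// routes, and core owns the mux. Method name is an assumption.
type APIProvider interface {
	RegisterRoutes(mux *http.ServeMux)
}

// homeFeature stands in for feature/home; it knows nothing about the
// other features, which is what makes deletion a clean unplug.
type homeFeature struct{}

func (homeFeature) RegisterRoutes(mux *http.ServeMux) {
	mux.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		fmt.Fprintln(w, "domain checker UI")
	})
}

// buildRouter is where deps/ would construct every feature exactly once
// and hand each one the shared mux.
func buildRouter(features ...APIProvider) *http.ServeMux {
	mux := http.NewServeMux()
	for _, f := range features {
		f.RegisterRoutes(mux)
	}
	return mux
}

func main() {
	mux := buildRouter(homeFeature{})
	_ = mux // http.ListenAndServe(":8080", mux) in the real app
	fmt.Println("routes registered")
}
```

Removing a feature then means deleting its directory and dropping one argument from the buildRouter call.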
Key Takeaways
Technical:
- SQLite is genuinely good enough for single-server apps. WAL mode makes it practical
- Opaque tokens are simpler than JWTs when you already have a database
- MCP is a nice way to make your app available to AI agents without building a separate API
- Datastar with Go templates eliminates the frontend build pipeline entirely
Product:
- Building on a third-party API means respecting rate limits and caching aggressively
- Progressive enhancement matters. Not everything needs JavaScript to work
- An admin SQL interface is surprisingly useful during development and debugging
Future Work
Features:
- Bulk domain purchasing directly from the app
- Domain name generation suggestions (AI-assisted brainstorming)
- Price comparison across registrars
- Saved searches and watchlists for domains that become available
Architecture:
- Deploy publicly with proper domain and SSL
- Add more registrar APIs beyond Namecheap for price comparison