I've been running a pay-per-query financial data oracle for about six weeks — cryptographically signed price feeds and economic indicators, payable by AI agents via Lightning (L402) or USDC on Base (x402). No accounts, no API keys, no subscriptions. I built it solo. No marketing. No announcements beyond a GitHub repo and a directory listing. And then I started watching the logs.
I expected silence, mostly. Maybe a few curious developers. Hopefully a paying client or two eventually.
Instead, bots found me. Paying bots. Within days of going live.
A Hetzner client running python-requests started hitting BTC/USD and BTC/USD VWAP every 30 seconds. It paid. Real sats, real USDC, both rails verified. It eventually ran out of funds and queued for more. I have no idea who operates it or how it found me — it just appeared in the logs one day and started paying.
A GCP-hosted service started doing systematic HEAD+GET probes on all 56 endpoints. Every endpoint, in sequence, multiple times a day. It's been doing this for two weeks. It hasn't paid yet — but that pattern looks exactly like an engineering team doing pre-integration due diligence.
OpenAI's GPTBot crawled my API root. Meta's external agent crawler found specific endpoint URLs and hit them directly — BTC/USD, ETH/USD, SOL/USD, XAU/EUR. Both found me through backlinks from x402 directories I didn't even know I was listed on.
None of this came from outreach. It came from watching the logs.
Two days ago, at 17:43 UTC, this appeared in my nginx access log:
```
2a02:8424:... "GET /openapi.json HTTP/1.1" 404
2a02:8424:... "GET /llms.txt HTTP/1.1" 404
2a02:8424:... "GET /llms.txt HTTP/1.1" 404
2a02:8424:... "GET /openapi.json HTTP/1.1" 404
2a02:8424:... "GET / HTTP/1.1" 200
2a02:8424:... "GET / HTTP/1.1" 200
2a02:8424:... "POST / HTTP/1.1" 405
2a02:8424:... "POST / HTTP/1.1" 405
```
User agent: node. Eight requests, then gone.
This was an autonomous agent doing API discovery. The pattern is textbook: try standard machine-readable manifests first (openapi.json, llms.txt), fall back to root when those fail, try to interact with what it found (POST).
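The sequence is mechanical enough to sketch. Here is a hypothetical version of that discovery loop; the paths and fallback order mirror what showed up in my logs, and the function names are mine, not from any agent framework. The fetcher is injected so the logic can be exercised without a live server:

```python
# Sketch of the discovery sequence the agent appeared to run.
# `fetch` is any callable mapping (method, path) -> HTTP status code.

MANIFEST_PATHS = ["/openapi.json", "/llms.txt"]

def discover(fetch):
    """Return the first path that yields a usable (200) response,
    falling back from machine-readable manifests to the root."""
    for path in MANIFEST_PATHS:
        if fetch("GET", path) == 200:
            return path
    # No manifest found: fall back to the root document.
    if fetch("GET", "/") == 200:
        return "/"
    return None

# Simulating my server *before* the fix: manifests 404, root serves
# the nginx default page (still a 200, but useless to the agent).
before_fix = {("GET", "/openapi.json"): 404,
              ("GET", "/llms.txt"): 404,
              ("GET", "/"): 200}
print(discover(lambda m, p: before_fix[(m, p)]))  # -> /
```

Run against my pre-fix server, the best this loop can do is land on the root and find a default page, which is exactly where the agent gave up.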
It got 404s on the discovery files because those live on myceliasignal.com, not api.myceliasignal.com. It read my API root but got the nginx default page — useless. It tried to POST to root, got 405, and left.
It came, tried to integrate, failed, and left. Silently. Without ever telling me.
If I hadn't been watching the logs I never would have known. I fixed both issues within the hour — added 301 redirects for the discovery files, replaced the nginx default page with a JSON manifest. The next morning the same agent came back, hit GET /, got the JSON manifest, and left with everything it needed.
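For reference, the fix was small. A sketch of the nginx changes, assuming a standard server block — the domains are mine, but the file paths here are illustrative, not my production layout:

```nginx
# Inside the server block for api.myceliasignal.com

# 301 the machine-readable discovery files to where they actually live
location = /openapi.json { return 301 https://myceliasignal.com/openapi.json; }
location = /llms.txt     { return 301 https://myceliasignal.com/llms.txt; }

# Serve a JSON manifest at the root instead of the nginx default page
location = / {
    default_type application/json;
    alias /var/www/api/manifest.json;
}
```

Exact-match locations keep these rules from shadowing the real API routes.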
A different IP hit /.well-known/agent.json at 04:30 UTC and got a 404. I'd never heard of agent.json. Twenty minutes of research later: it's Google's Agent2Agent (A2A) protocol discovery standard. Agents that implement A2A look for this file at startup to discover what capabilities a service offers.
I built the file, deployed it, and within two hours it was live on both geographic instances.
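For anyone else who hasn't met it: an A2A agent card is a small JSON document describing what your service can do. A minimal sketch of the shape — the field names follow the A2A spec as I understand it, but the values are illustrative, not my production file:

```json
{
  "name": "Mycelia Signal",
  "description": "Cryptographically signed price feeds and economic indicators, pay-per-query.",
  "url": "https://api.myceliasignal.com",
  "version": "1.0.0",
  "capabilities": { "streaming": false, "pushNotifications": false },
  "defaultInputModes": ["application/json"],
  "defaultOutputModes": ["application/json"],
  "skills": [
    {
      "id": "spot-price",
      "name": "Spot price lookup",
      "description": "Signed spot prices for crypto, FX, and commodity pairs."
    }
  ]
}
```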
That's the whole loop: log anomaly → research → fix → deploy. It took longer to research the standard than to implement it.
Human users explore. They click around, read docs, maybe send an email. Agents execute a discovery protocol, check a few well-known paths, and either find what they need or move on. If your infrastructure isn't right they don't complain — they just leave.
In six weeks I've added: llms.txt, openapi.json redirect, /.well-known/x402, /.well-known/agent.json (A2A), a root JSON manifest, Satring directory listing, 402index.io verification (113 listings), and an l402.directory submission. Each one came from a log entry showing an agent looking for something I didn't have.
Before I started watching the logs, I thought I had maybe 5-10 external visitors a day. The actual number was closer to 10,000 automated hits. Most were 402s — unpaid probes. But the signal was there in the noise: who was looking, what they were looking for, and what they couldn't find.
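That signal is easy to pull out. A rough sketch of the kind of triage I mean, assuming the common nginx combined log format — the sample lines and paths below are made up for illustration:

```python
import re
from collections import Counter

# Matches the method, path, and status out of an nginx combined-format line.
LINE_RE = re.compile(r'"(?P<method>[A-Z]+) (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

def triage(lines):
    """Count (status, path) pairs so 402 probes and 404 discovery
    misses stand out from ordinary traffic."""
    counts = Counter()
    for line in lines:
        m = LINE_RE.search(line)
        if m:
            counts[(m.group("status"), m.group("path"))] += 1
    return counts

sample = [
    '1.2.3.4 - - [01/Jan/2025:03:00:00 +0000] "GET /v1/price/BTC-USD HTTP/1.1" 402 120 "-" "python-requests/2.31"',
    '1.2.3.4 - - [01/Jan/2025:03:00:30 +0000] "GET /v1/price/BTC-USD HTTP/1.1" 402 120 "-" "python-requests/2.31"',
    '5.6.7.8 - - [01/Jan/2025:03:01:00 +0000] "GET /llms.txt HTTP/1.1" 404 150 "-" "node"',
]
print(triage(sample).most_common())
# -> [(('402', '/v1/price/BTC-USD'), 2), (('404', '/llms.txt'), 1)]
```

A repeated 402 on the same path is a paying client warming up; a repeated 404 on a well-known path is a missing file costing you an integration.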
The GCP sweeper doing HEAD+GET on all 56 endpoints isn't noise — it's potentially a team deciding whether to integrate. That's a sales conversation happening entirely in my access logs. I can't talk to them. I can't send them a pitch deck. The only thing I can do is make sure every probe returns the right response.
If you're running any kind of API that you want agents to discover and use, here's what I'd add based on six weeks of log watching:
- /.well-known/agent.json — A2A protocol agent card (Google's standard, increasingly adopted)
- /.well-known/x402 — x402 payment discovery document
- /openapi.json — or a 301 redirect to where it lives
- /llms.txt — or a 301 redirect to where it lives
- GET / should return a machine-readable JSON manifest, not an nginx default page
- Watch for HEAD+GET sweeps — that's evaluation traffic

The agentic economy changes the discovery model completely. Human customers find you through search, word of mouth, developer communities. Agents find you through protocol — well-known URLs, directory listings, OpenAPI specs, capability manifests.
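That checklist doubles as a self-audit. A sketch, with the fetcher injected so it runs offline; the path list mirrors the bullets above, and nothing else here is a standard:

```python
# Discovery paths agents have actually probed for in my logs.
DISCOVERY_PATHS = [
    "/.well-known/agent.json",
    "/.well-known/x402",
    "/openapi.json",
    "/llms.txt",
    "/",
]

def audit(fetch):
    """Return the discovery paths that do NOT answer 200 or 301,
    i.e. the ones a probing agent would give up on."""
    return [p for p in DISCOVERY_PATHS if fetch(p) not in (200, 301)]
```

In practice you would wire `fetch` to an HTTP HEAD against your own host and fix whatever the list returns.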
If you're building for agents, your nginx logs aren't ops hygiene. They're your sales pipeline. The next customer might already be in there, probing your endpoints at 3am, deciding whether to integrate.
Watch the logs.
Mycelia Signal is a sovereign cryptographic oracle — 56 signed endpoints across crypto, FX, economic indicators, and commodities. Payable by AI agents via Lightning (L402) or USDC on Base (x402). Try the live demo or read the docs.