← Signals

Your Nginx Logs Are Your Sales Pipeline (In the Agentic Economy)

I've been running a pay-per-query financial data oracle for about six weeks — cryptographically signed price feeds and economic indicators, payable by AI agents via Lightning (L402) or USDC on Base (x402). No accounts, no API keys, no subscriptions. I built it solo. No marketing. No announcements beyond a GitHub repo and a directory listing. And then I started watching the logs.

What I Expected

Silence, mostly. Maybe a few curious developers. Hopefully a paying client or two eventually.

What Actually Happened

Bots found me. Paying bots. Within days of going live.

A Hetzner client running python-requests started hitting BTC/USD and BTC/USD VWAP every 30 seconds. It paid. Real sats, real USDC, both rails verified. It eventually ran out of funds and queued for more. I have no idea who operates it or how it found me — it just appeared in the logs one day and started paying.

A GCP-hosted service started doing systematic HEAD+GET probes on all 56 endpoints. Every endpoint, in sequence, multiple times a day. It's been doing this for two weeks. It hasn't paid yet — but that pattern looks exactly like an engineering team doing pre-integration due diligence.

OpenAI's GPTBot crawled my API root. Meta's external agent crawler found specific endpoint URLs and hit them directly — BTC/USD, ETH/USD, SOL/USD, XAU/EUR. Both found me through backlinks from x402 directories I didn't even know I was listed on.

None of this came from outreach. It came from watching the logs.


The Moment That Changed How I Think About This

Two days ago, at 17:43 UTC, this appeared in my nginx access log:

2a02:8424:... "GET /openapi.json HTTP/1.1" 404
2a02:8424:... "GET /llms.txt HTTP/1.1" 404
2a02:8424:... "GET /llms.txt HTTP/1.1" 404
2a02:8424:... "GET /openapi.json HTTP/1.1" 404
2a02:8424:... "GET / HTTP/1.1" 200
2a02:8424:... "GET / HTTP/1.1" 200
2a02:8424:... "POST / HTTP/1.1" 405
2a02:8424:... "POST / HTTP/1.1" 405

User agent: node. Eight requests, then gone.

This was an autonomous agent doing API discovery. The pattern is textbook: try standard machine-readable manifests first (openapi.json, llms.txt), fall back to root when those fail, try to interact with what it found (POST).
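That discovery sequence can be sketched in a few lines. This is an illustrative reconstruction, not the agent's actual code — the `fetch` callable stands in for whatever HTTP client it used, so the logic can be exercised without a live server:

```python
# Sketch of the discovery protocol the agent appeared to follow.
# `fetch` is any callable (method, path) -> HTTP status code.

DISCOVERY_PATHS = ["/openapi.json", "/llms.txt"]  # machine-readable manifests

def discover(fetch):
    """Return the first usable path, mirroring the probe order in the log."""
    for path in DISCOVERY_PATHS:
        if fetch("GET", path) == 200:
            return path              # found a manifest, done
    if fetch("GET", "/") == 200:     # fall back to the API root
        return "/"
    return None                      # nothing usable; the agent leaves silently
```

Against a real API, `fetch` would wrap something like `requests.request(method, base_url + path).status_code`.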

It got 404s on the discovery files because those live on myceliasignal.com, not api.myceliasignal.com. It read my API root but got the nginx default page — useless. It tried to POST to root, got 405, and left.

It came, tried to integrate, failed, and left. Silently. Without ever telling me.

If I hadn't been watching the logs, I never would have known. I fixed both issues within the hour — added 301 redirects for the discovery files, replaced the nginx default page with a JSON manifest. The next morning the same agent came back, hit GET /, got the JSON manifest, and left with everything it needed.
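The fix itself is a few lines of nginx. This is a sketch, not my actual config — the file paths and the manifest body are illustrative, and these `location` blocks would sit inside the server block for the API domain:

```nginx
# Redirect discovery files to the domain where they actually live
location = /openapi.json { return 301 https://myceliasignal.com/openapi.json; }
location = /llms.txt     { return 301 https://myceliasignal.com/llms.txt; }

# Serve a machine-readable manifest at the root instead of the default page
location = / {
    default_type application/json;
    return 200 '{"name":"Mycelia Signal","docs":"https://myceliasignal.com/docs"}';
}
```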


The Next Morning

A different IP hit /.well-known/agent.json at 04:30 UTC and got a 404. I'd never heard of agent.json. Twenty minutes of research later: it's Google's Agent2Agent (A2A) protocol discovery standard. Agents that implement A2A look for this file at startup to discover what capabilities a service offers.

I built the file and deployed it to both nodes; within two hours it was live on both geographic instances.

That's the whole loop: log anomaly → research → fix → deploy. It took longer to research the standard than to implement it.
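For reference, a minimal agent card looks something like this. The field names follow my reading of the A2A agent card schema — check the spec before copying, and treat the skill IDs and descriptions here as illustrative:

```json
{
  "name": "Mycelia Signal",
  "description": "Cryptographically signed price feeds and economic indicators, pay-per-query.",
  "url": "https://api.myceliasignal.com",
  "version": "1.0.0",
  "capabilities": { "streaming": false },
  "skills": [
    {
      "id": "price-feed",
      "name": "Signed price feed",
      "description": "Signed spot prices for crypto, FX, commodities, and economic indicators."
    }
  ]
}
```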


What the Logs Have Taught Me

Agents don't browse. They probe.

Human users explore. They click around, read docs, maybe send an email. Agents execute a discovery protocol, check a few well-known paths, and either find what they need or move on. If your infrastructure isn't right they don't complain — they just leave.

Your discovery stack is your storefront.

In six weeks I've added: llms.txt, openapi.json redirect, /.well-known/x402, /.well-known/agent.json (A2A), a root JSON manifest, Satring directory listing, 402index.io verification (113 listings), and an l402.directory submission. Each one came from a log entry showing an agent looking for something I didn't have.

The silence is misleading.

Before I started watching the logs I thought I had maybe 5-10 external visitors a day. The actual number was closer to 10,000 automated hits. Most were 402s — unpaid probes. But the signal was there in the noise: who was looking, what they were looking for, and what they couldn't find.

Pre-integration evaluation looks like traffic.

The GCP sweeper doing HEAD+GET on all 56 endpoints isn't noise — it's potentially a team deciding whether to integrate. That's a sales conversation happening entirely in my access logs. I can't talk to them. I can't send them a pitch deck. The only thing I can do is make sure every probe returns the right response.


The Practical Checklist

If you're running any kind of API that you want agents to discover and use, here's what I'd add based on six weeks of log watching:

Discovery files — serve these at your API domain

  • /.well-known/agent.json — A2A protocol agent card (Google's standard, increasingly adopted)
  • /.well-known/x402 — x402 payment discovery document
  • /openapi.json — or a 301 redirect to where it lives
  • /llms.txt — or a 301 redirect to where it lives

Root response

  • GET / should return a machine-readable JSON manifest, not an nginx default page
  • Include: name, description, docs URL, openapi URL, health endpoint, sample endpoint
  • This is the fallback when everything else fails — and agents do fall back to it
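A root manifest covering those fields might look like this — a sketch only; the exact URLs and endpoint paths are illustrative, not my production manifest:

```json
{
  "name": "Mycelia Signal",
  "description": "Pay-per-query signed price oracle (L402 / x402).",
  "docs": "https://myceliasignal.com/docs",
  "openapi": "https://myceliasignal.com/openapi.json",
  "health": "https://api.myceliasignal.com/health",
  "sample": "https://api.myceliasignal.com/btc-usd"
}
```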

Directory listings

  • Get listed on Satring, 402index.io, l402.directory, and 402.pub if you're running L402/x402 endpoints
  • Automated crawlers find these directories and follow the links
  • That's how OpenAI and Meta found me — not outreach, backlinks

Watch your logs for

  • 404s on well-known paths — an agent looking for something you don't have
  • Systematic HEAD+GET sweeps — evaluation traffic
  • Unknown user agents — new directories and frameworks constantly emerging
  • POST attempts to unexpected endpoints — agents inferring your API structure
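The first two checks are easy to automate. A minimal sketch, assuming nginx's standard combined log format — the regex, well-known path list, and sample output shape are my own choices, not a standard tool:

```python
import re
from collections import Counter

# Matches the request and status fields of nginx's combined log format.
LINE_RE = re.compile(r'"(?P<method>[A-Z]+) (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

WELL_KNOWN = {"/openapi.json", "/llms.txt", "/.well-known/agent.json", "/.well-known/x402"}

def scan(lines):
    """Return (well-known paths that 404'd, HEAD-request counts per path)."""
    missing, heads = set(), Counter()
    for line in lines:
        m = LINE_RE.search(line)
        if not m:
            continue
        if m["status"] == "404" and m["path"] in WELL_KNOWN:
            missing.add(m["path"])    # an agent wanted this and we don't serve it
        if m["method"] == "HEAD":
            heads[m["path"]] += 1     # HEAD sweeps suggest pre-integration evaluation
    return missing, heads
```

Run it over `access.log` nightly and alert on any new entry in `missing` — that's exactly the agent.json discovery I would otherwise have missed.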

The Bigger Point

The agentic economy changes the discovery model completely. Human customers find you through search, word of mouth, developer communities. Agents find you through protocol — well-known URLs, directory listings, OpenAPI specs, capability manifests.

If you're building for agents, your nginx logs aren't ops hygiene. They're your sales pipeline. The next customer might already be in there, probing your endpoints at 3am, deciding whether to integrate.

Watch the logs.


Mycelia Signal is a sovereign cryptographic oracle — 56 signed endpoints across crypto, FX, economic indicators, and commodities. Payable by AI agents via Lightning (L402) or USDC on Base (x402). Try the live demo or read the docs.