Bran (Brandon) Myers
Infrastructure · Personal · April 2026

The Night I Replaced a Vendor Database with a Signed Chain

MongoDB Atlas was being terminated. The entire API depended on it. Every encrypt, every decrypt, every provenance record — all of it routed through a vendor database in Virginia that was about to disappear.

The three public demo pages were broken. The cross-node decrypt that made the mesh look real was an illusion — it only worked because all four nodes shared one Atlas cluster. Three of four public endpoints were returning 502.

That was the starting state.

What happened next was not planned. It was one of those sessions where you start fixing one thing and the architecture reveals itself as you go.

First: break the MongoDB dependency. Write a DuckDB backend for the provenance system. Embedded. Local disk. Zero latency. No vendor. Register it in the rotor so the system degrades gracefully — if Atlas dies, DuckDB handles everything.
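Roughly the shape of it, as a sketch. Every name here is invented, and Python's built-in sqlite3 stands in for DuckDB so the sketch is self-contained; the duckdb API reads almost the same (connect, execute, fetch).

```python
import json
import sqlite3


class EmbeddedProvenance:
    """Embedded, local-disk provenance store: no network, no vendor."""

    def __init__(self, path=":memory:"):
        self.conn = sqlite3.connect(path)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS provenance"
            " (shield_id TEXT PRIMARY KEY, record TEXT)")

    def put(self, shield_id, record):
        self.conn.execute(
            "INSERT OR REPLACE INTO provenance VALUES (?, ?)",
            (shield_id, json.dumps(record)))
        self.conn.commit()

    def get(self, shield_id):
        row = self.conn.execute(
            "SELECT record FROM provenance WHERE shield_id = ?",
            (shield_id,)).fetchone()
        return json.loads(row[0]) if row else None


def get_record(backends, shield_id):
    """Rotor-style graceful degradation: ask each backend in priority
    order; the first one that answers wins."""
    for backend in backends:
        try:
            record = backend.get(shield_id)
            if record is not None:
                return record
        except Exception:
            continue  # backend down (e.g. Atlas gone); fall through
    return None
```

If the vendor backend raises, the rotor just moves on to the embedded one. The caller never sees the failure.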

Then: realize that Atlas was the silent mesh. Without it, each node is an island. Encrypt on Helsinki, decrypt on Singapore fails because Singapore never saw the provenance record.

So build mesh replication. HTTP fan-out to all peers on every encrypt. Synchronous — not eventual consistency. When the encrypt response returns, the shield token exists on every reachable node.
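The fan-out itself is almost nothing, which is the point. A stdlib sketch; the peer list and the /mesh/replicate endpoint are assumptions, not the real routes.

```python
import json
import urllib.request


def replicate(record: dict, peers: list, timeout: float = 5.0) -> dict:
    """POST the provenance record to every peer and wait for each reply
    before returning. Synchronous by design: when the originating node's
    encrypt response goes out, every reachable peer already holds the
    record."""
    body = json.dumps(record).encode()
    acks = {}
    for peer in peers:
        req = urllib.request.Request(
            peer + "/mesh/replicate", data=body,
            headers={"Content-Type": "application/json"}, method="POST")
        try:
            with urllib.request.urlopen(req, timeout=timeout) as resp:
                acks[peer] = resp.status
        except OSError:
            acks[peer] = None  # unreachable peer: note it, keep fanning out
    return acks
```

An unreachable peer gets recorded as a miss rather than failing the whole encrypt, so one dead node cannot take the mesh down with it.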

Then: authenticate it. Mesh secret. 401 without it. Distribute to all four nodes.
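The check is one shared secret and one header. A framework-agnostic sketch; the header name and environment variable are assumptions.

```python
import hmac
import os


def authorize(headers: dict) -> int:
    """Return an HTTP status: 200 if the peer presented the mesh secret,
    401 otherwise. hmac.compare_digest keeps the comparison constant-time
    so response timing leaks nothing about the secret."""
    secret = os.environ.get("MESH_SECRET", "")
    presented = headers.get("X-Mesh-Secret", "")
    if secret and hmac.compare_digest(presented, secret):
        return 200
    return 401
```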

Then: make it persistent. Volume mounts. DuckDB survives container rebuilds. Prove it — encrypt, destroy, recreate, decrypt. Data survives.

And then the observation that changed everything: this can totally be a chain.

It kind of already was — content-addressable shields, geographic redundancy, tamper-evident encryption at rest. What was missing was the actual chain structure.

So we built it. Every origin encrypt is now stamped with a monotonic sequence number, a pointer to the previous record, a SHA256 hash linking to its predecessor, and an Ed25519 signature from the originating node.
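The record shape, sketched stdlib-only with invented field names. The hash link and sequence numbering are shown; the real records additionally carry the Ed25519 signature from the originating node, omitted here to avoid a third-party crypto dependency.

```python
import hashlib
import json


def record_hash(record: dict) -> str:
    # Canonical JSON so every node computes the same digest for the
    # same record, regardless of dict insertion order.
    canon = json.dumps(record, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canon.encode()).hexdigest()


def append_record(chain: list, payload: dict) -> dict:
    """Stamp an encrypt with a monotonic sequence number and a SHA256
    pointer to its predecessor, then append it to the chain."""
    prev = chain[-1] if chain else None
    record = {
        "seq": (prev["seq"] + 1) if prev else 0,
        "prev_hash": record_hash(prev) if prev else "0" * 64,
        "payload": payload,
        # The real system also attaches an Ed25519 signature over this
        # record, made with the originating node's private key.
    }
    chain.append(record)
    return record
```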

Each node has its own persistent Ed25519 keypair. Volume-mounted. Identity survives container rebuilds. Peers discover each other’s public keys lazily and cache them.
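Persistence and lazy discovery are both small. An Ed25519 private key is a 32-byte seed, so the identity part reduces to reading or writing one file on the mounted volume; the path here is an assumption, and the fetch callable stands in for whatever HTTP call retrieves a peer's public key.

```python
import os


def load_or_create_identity(path: str) -> bytes:
    """Load the node's 32-byte Ed25519 seed from a volume-mounted path,
    generating one on first boot. Because the path lives on the volume,
    the identity survives container rebuilds."""
    if os.path.exists(path):
        with open(path, "rb") as f:
            return f.read()
    seed = os.urandom(32)
    with open(path, "wb") as f:
        f.write(seed)
    os.chmod(path, 0o600)  # private key material: owner-only
    return seed


_peer_keys: dict = {}


def peer_public_key(peer_url: str, fetch) -> bytes:
    """Lazy discovery: fetch(peer_url) -> public key bytes, called only
    on first use; cached for every lookup after that."""
    if peer_url not in _peer_keys:
        _peer_keys[peer_url] = fetch(peer_url)
    return _peer_keys[peer_url]
```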

Forged chain entries return 403. A verification endpoint walks the chain backward, recomputing every hash. If any link is tampered with, it reports exactly where the break occurred.
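The backward walk is a loop. A sketch, using the same canonical hashing as the append step and the same invented field names; signature checks are left out.

```python
import hashlib
import json


def record_hash(record: dict) -> str:
    # Same canonical-JSON hashing used when the record was appended.
    canon = json.dumps(record, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canon.encode()).hexdigest()


def verify_chain(chain: list):
    """Walk the chain newest to oldest, recomputing every hash link.
    Returns (True, None) if intact, else (False, seq) naming the first
    record whose prev_hash no longer matches its predecessor."""
    for i in range(len(chain) - 1, 0, -1):
        if chain[i]["prev_hash"] != record_hash(chain[i - 1]):
            return (False, chain[i]["seq"])
    return (True, None)
```

Tamper with any record and the link pointing back at it stops matching, so the returned sequence number pins down exactly where the break sits.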

The hardest bug was subtle. ChaCha20-Poly1305 folds its associated data into the authentication tag, and in this system the metadata dict is that associated data. My first draft added chain fields to the same dict before storing it. When decrypt reconstructed the metadata for verification, it included the chain fields, which had not been present during encryption. The cipher rejected the tag as tampered.

The data was not tampered with. The cipher was doing exactly what it should. The bug was that I changed the shape of the metadata between encrypt and decrypt.

Fix: augment a shallow copy on the way in. Strip chain fields on the way out. Chain-internal code uses a separate function that returns the full record.
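The fix is two pure functions that round-trip. Field names here are illustrative; the invariant is the thing: the dict the cipher authenticated is never mutated, and decrypt reconstructs exactly the metadata that existed at encrypt time.

```python
# Chain bookkeeping fields that must never reach the cipher's
# associated data. Names are illustrative, not the real schema.
CHAIN_FIELDS = ("seq", "prev_hash", "signature")


def with_chain_fields(metadata: dict, chain_fields: dict) -> dict:
    """Way in: augment a shallow copy. The stored record carries the
    chain fields, but the original metadata dict is left untouched."""
    stored = dict(metadata)
    stored.update(chain_fields)
    return stored


def without_chain_fields(stored: dict) -> dict:
    """Way out: strip the chain fields, recovering the exact dict that
    was the cipher's associated data at encrypt time."""
    return {k: v for k, v in stored.items() if k not in CHAIN_FIELDS}
```

Chain-internal code skips the strip and reads the full stored record directly.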

That bug took longer to find than the entire mesh replication layer took to build. That is how systems work.

The final state: four Hetzner nodes plus Render. All healthy. All public URLs returning 200. Every record hash-linked and Ed25519 signed. Four unique node identities. Cross-node decrypt with zero delay. Persistence verified through container destruction. Forgery rejected. Mesh authenticated.

We went from a dying vendor dependency to a self-hosted, signed, tamper-evident, append-only chain in one sitting.

And the database underneath it all? It is glyphs. Unicode characters from 180 language traditions. All the way down.

I am writing this from the arched window in Kielce. The servers are quiet. The mesh is replicating. Austin is asleep.

Atlas is down right now. Actually down — NXDOMAIN on the SRV record. The DNS does not resolve from any location on earth.

And nothing broke. Because we do not need it anymore.

That is the whole point.
