A breakdown of Die Forward — the design, the process, and what happens when you treat AI as your entire production pipeline.
I've been in tech since 1996. I've shipped products, led teams, and built across every major platform. But I hadn't built a game from scratch. I wanted to know how far AI tools could take a solo builder in a two-week hackathon window.
Not vibe-coding a prototype. A complete, polished, shippable game.
The game was the vehicle. The experiment was the point.
Games have always been where we push limits. Not because games are frivolous -- because they demand everything at once.
Doom and Quake didn't just sell copies in the 90s. They pushed the hardware industry toward dedicated 3D accelerators. NVIDIA's first consumer GPUs were built for gaming -- and that same parallel architecture now runs AI models. Multiplayer gaming drove consumer demand for the low-latency networking we now take for granted. World of Warcraft ran real-time distributed systems at a scale most enterprises wouldn't match for another decade. Mobile gaming helped push the iPhone's touchscreen into the mainstream. And Dark Souls popularized an entirely new social mechanic -- asynchronous multiplayer, players leaving traces for each other -- that no other medium had tried.
Games push UX, technology, and imagination because they're hard. They require graphics, audio, real-time systems, narrative, economics, and social mechanics -- all working together, all judged instantly by the player. There's no hiding behind a spreadsheet or a backend dashboard. It either feels right or it doesn't.
So if you want to test a new toolset, build a game. It will find every gap.
I wanted to test whether AI tools could handle the full production stack -- not just code, but content, audio, visuals, video, and marketing. A game would stress-test all of it.
The rule was simple: AI handles execution, I handle design and direction.
Script? AI writes it under my direction. Sound effects? AI generates them from prompts I write. Music score? AI composes it. Visual identity? AI generates everything from prompts I craft. Pitch video? AI writes the React code, AI voices the VO, AI composes the music. Social media launch? AI-drafted copy, AI-generated images.
I used Claude for world-building and content generation, ElevenLabs for all audio (VO, SFX, and original music), DALL-E for the entire visual identity, Remotion with AI-generated React components for the pitch video, and Anchor/Solana for the on-chain layer.
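Remotion's core idea is that a video is just a React component rendered once per frame: every visual property is a pure function of the current frame number. Here's a minimal sketch of that model -- not the actual project code, and with a hand-rolled version of the interpolation helper Remotion itself provides, returning plain style values instead of JSX so it stands alone:

```typescript
// Map a frame number from an input range to an output range, clamped at the
// ends -- the same shape as Remotion's built-in `interpolate` helper.
function interpolate(
  frame: number,
  inputRange: [number, number],
  outputRange: [number, number],
): number {
  const [inStart, inEnd] = inputRange;
  const [outStart, outEnd] = outputRange;
  const t = Math.min(1, Math.max(0, (frame - inStart) / (inEnd - inStart)));
  return outStart + t * (outEnd - outStart);
}

// A hypothetical title card that fades in and scales up over the first
// second (assuming 30 fps). In a real Remotion component, `frame` would
// come from the `useCurrentFrame()` hook and these values would feed JSX.
function titleCardStyle(frame: number): { opacity: number; fontSize: number } {
  return {
    opacity: interpolate(frame, [0, 30], [0, 1]),
    fontSize: interpolate(frame, [0, 30], [48, 64]),
  };
}

console.log(titleCardStyle(0));  // start of the fade: fully transparent
console.log(titleCardStyle(15)); // halfway through
console.log(titleCardStyle(30)); // fade complete
```

Because each frame is deterministic, an AI can generate scene components from a prompt and the render is reproducible -- which is what makes the approach workable for a pitch video.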
The result: Die Forward. A web3 roguelite where dying is the feature. Built in about two weeks, solo.
Here's how each piece came together.