Building a Spotify Quiz Game from the Passenger Seat: What AI-Assisted Development Actually Looks Like
There's a popular music game called Hitster that works like this: scan a QR code, Spotify plays a song, your friends guess what year it came out, and everyone sorts cards on a timeline. It's simple, it's addictive, and it costs about €25 for a deck of cards.
One developer decided: why buy it when you can build your own? The catch: they were a passenger on a highway drive from Austria to Germany, armed with nothing but a phone, a home server, and Claude AI as their coding partner.
The result is a fascinating case study in what modern AI-assisted development can actually accomplish—and where it still hits walls.
The Constraints Were the Point
This wasn't a contrived experiment. It was a genuine attempt to solve a problem under real limitations. No laptop. No keyboard. No ability to Google API docs while squinting at a phone screen. Just voice messages to Claude via Telegram, a home server handling execution, and a willingness to iterate based on what appeared on the mobile screen.
The stack that emerged was intentionally minimal: vanilla HTML, CSS, and JavaScript. Spotify's PKCE authentication flow (which requires no backend). QRCode.js for generating codes. Html5-QRCode for scanning them. Nothing bloated. Nothing unnecessary.
This is actually a lesson in constraint-driven design. When you can't easily reference libraries or documentation, you choose tools that are battle-tested and well-documented. You avoid exotic dependencies. You keep complexity predictable.
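To make the QR round-trip concrete, here is a minimal sketch using the two libraries the article names. The payload format (a JSON wrapper around a Spotify track URI) and the `playTrack` callback are assumptions for illustration, not the project's actual scheme:

```javascript
// Encode a track reference into the text a QR card carries.
// The { v, track } shape is an illustrative assumption.
function encodeCard(trackUri) {
  return JSON.stringify({ v: 1, track: trackUri });
}

// Recover the track URI from a scanned code.
function decodeCard(text) {
  const { track } = JSON.parse(text);
  return track;
}

// Browser side, generating a card (QRCode.js):
//   new QRCode(document.getElementById("card"), { text: encodeCard(uri) });
//
// Scanner side (html5-qrcode):
//   new Html5Qrcode("reader").start(
//     { facingMode: "environment" },          // prefer the rear camera
//     { fps: 10 },
//     (text) => playTrack(decodeCard(text))); // playTrack is hypothetical
```

The split matters: the encode/decode helpers are pure and testable anywhere, while the two library calls stay thin wrappers around them.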
Three Hours Until Your First Win
By the time the car pulled into a Burger King for a break, the core functionality was working. The developer could scan a QR code on their phone and actually hear music from their Spotify playlist start playing.
Think about that timeline: concept to working prototype in three hours, entirely through voice commands and remote file execution. The Spotify authentication alone—which typically involves navigating developer consoles, understanding OAuth flows, and debugging token exchanges—worked on the first real test.
The secret wasn't magic. It was effective communication. The developer told Claude exactly what they wanted to build. Claude asked clarifying questions, chose appropriate tools, and executed. When the developer hit a wall (needing to register as a test user with Spotify), they asked a human question and got a human answer. The AI didn't need to guess.
Where the Workflow Actually Shined
Remote error reporting changed the debugging game. Testing on Android Chrome meant no DevTools console. The typical workflow would be: make a guess at what's broken, modify code, reload, see if it works, repeat. Instead, Claude suggested adding client-side error reporting that sent all JavaScript errors back to the server. Suddenly the developer could simply describe what they saw ("the button doesn't respond") and Claude could read the actual error messages. Debugging became almost as fast as having DevTools open—except the developer never had to read a single stack trace themselves.
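The idea can be sketched in a few lines. The `/client-error` endpoint name and payload shape below are illustrative assumptions, not taken from the project:

```javascript
// Forward uncaught browser errors to the dev server so they can be
// read remotely. `post` is injected so the reporter stays testable.
function installErrorReporter(post) {
  const report = (payload) => post("/client-error", JSON.stringify(payload));

  // Uncaught synchronous errors, with file/line/stack when available.
  window.onerror = (message, source, lineno, colno, error) => {
    report({ message, source, lineno, colno, stack: error && error.stack });
  };

  // Promise rejections nothing handled (e.g. a failed fetch).
  window.onunhandledrejection = (event) => {
    report({ message: `Unhandled rejection: ${event.reason}` });
  };
}

// In the page:
//   installErrorReporter((url, body) =>
//     fetch(url, {
//       method: "POST",
//       headers: { "Content-Type": "application/json" },
//       body,
//     }));
```

With something like this in place, "the button doesn't respond" on the phone becomes a stack trace in the server log.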
Authentication just worked. Spotify's PKCE flow is designed to work without a backend server, which meant zero infrastructure complexity. The developer grabbed a Client ID from the Spotify Developer Console, handed it to Claude, and login functionality was implemented. A couple of iterations to add the developer as a test user, and it was done. This is a great example of technology choices that enable remote-first workflows.
Tailscale Serve enabled instant mobile testing. Instead of wrestling with port forwarding or firewall rules, Claude executed tailscale serve --bg --https=8443 http://127.0.0.1:8080 and the app was immediately accessible on the developer's phone through their tailnet. No documentation lookup. It just worked. This is the kind of infrastructure tool that makes AI-assisted development feasible.
UI design that didn't require a designer. Claude's training on design systems and UI patterns meant the rough interface didn't stay rough. The developer got something that looked polished and intentional without having to learn Figma or argue about color palettes.
And Then Reality Showed Up
The Spotify API broke silently. The playlist endpoint was returning 403 Forbidden errors despite correct authentication and scopes. Claude dug through the error logs from the browser and eventually discovered that Spotify had changed their API in February 2026: the /v1/playlists/{id}/tracks endpoint had been replaced with /v1/playlists/{id}/items. Two lines of code to fix. But it took several iterations to diagnose because the error wasn't obvious from the surface. This revealed something important: AI can be good at systematic debugging when it has error logs, but it's not magic. It still needs to reason through problems step by step.
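The fix itself is tiny, which is part of what made it hard to find. An illustrative version, with the helper name hypothetical and the paths taken from the diagnosis above:

```javascript
// Build the playlist contents URL.
function playlistItemsUrl(playlistId) {
  // Before (began returning 403): `/v1/playlists/${playlistId}/tracks`
  return `https://api.spotify.com/v1/playlists/${playlistId}/items`;
}

// Usage with the PKCE access token:
//   const res = await fetch(playlistItemsUrl(id), {
//     headers: { Authorization: `Bearer ${accessToken}` },
//   });
```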
Then came the legal wall. With the game nearly functional, the developer asked Claude: "Why hasn't anyone released a public Hitster clone?"
The answer came back immediately: Spotify's Developer Policy explicitly forbids building games with their API. Section III is clear: "Do not create a game, including trivia quizzes."
That's not a licensing loophole. That's not something you work around. That's a straight-up prohibition. Hitster itself likely has special licensing terms with Spotify. The reason there's no popular public clone isn't that it's technically difficult—it's that it's contractually forbidden.
This is a crucial lesson about AI-assisted development: tools can make it trivially easy to build things that violate terms of service. The technical lift has dropped so far that legal and business constraints are now often the actual bottleneck.
The Workflow That Made It Work
Voice messages to Claude Code via Telegram. Tailscale for instant network access. The --dangerously-skip-permissions flag so Claude could execute file writes and shell commands without waiting for approval from a passenger seat. That last detail matters: you can't approve every action while driving. The trust model had to shift.
Voice in, code out, reload on phone. Three tools creating a feedback loop that was just tight enough to work from a car.
What's Missing (And Probably Matters)
The client-side error reporting workaround works, but it's not ideal. Chrome's DevTools Protocol could give Claude direct access to the browser's console output, network requests, and DOM state without building custom logging. That would be closer to a real development environment.
More broadly, this project is limited to what can run on a home server reachable through a Tailscale tunnel. Build something that needs cloud infrastructure, and you're back to dealing with CLI tools, environment variables, and complex deployments.
And of course, the ToS wall means this particular game idea stays private—a proof of concept for friends, not a product.
The Real Takeaway
This isn't a story about AI replacing developers. It's a story about how much faster iteration becomes when your bottleneck shifts from "finding the right library" and "deciphering documentation" to "clearly communicating what you want."
The developer still had to understand what they were building. They had to recognize the Spotify legal issue when Claude flagged it. They had to know which tools (Tailscale, PKCE, vanilla JS) would work best under these constraints.
What changed is the time between "I have an idea" and "here's a working prototype." From days to hours. From hours to a car ride.
That's not magic. It's what happens when you combine better tools with clear communication and reasonable constraints.
Interested in building projects faster? NameOcean's Vibe Hosting and AI-powered development tools are designed for exactly this kind of rapid iteration. Whether you're prototyping a new idea or scaling production infrastructure, we've got the network, the domains, and the tooling to move at speed.