Ship the Right Thing

How I replaced a waterfall requirements machine with a twice-weekly discovery process that aligned five engineering disciplines before a line of code was written.

TL;DR

  • OTA1 failed because five engineering disciplines each built their own interpretation of a single-sentence feature description — nobody had synced
  • Launched twice-weekly Discovery Workshops: all five disciplines in the room before anyone starts building, with a structured process and an AI-assisted PRD output
  • Format was adopted by two other product lines at Gentex
  • The backlog went from a wishlist to a real planning tool with cross-functional buy-in baked in from the start

The Situation

After launch, we had a Product Definition Summary (PDS) with a list of features we intended to build. Single bullet points, each one a sentence or two. The intent was always that these would get fleshed out collaboratively before anyone started building. That’s not what happened.

The waterfall machine at Gentex got ahead of itself. Systems engineers started writing detailed requirements from those bullet points months before implementation was planned, without input from UX, mobile, or cloud. The firmware team, conditioned by years of automotive process, treated those requirements as baselined and started implementing against them. Meanwhile, UX and mobile had developed their own interpretations of the same features. Nobody had synced.

When we finally brought the work together, the misalignment was obvious. Three teams had built three different mental models of the same feature. Requirements had been written, reviewed, and baselined for things we were about to pivot away from entirely. The OTA1 process was messier than it needed to be, took longer than it should have, and left the organization less confident in its ability to keep pace with user feedback.

The bullet-point PDS wasn’t the problem. Treating bullet points like finalized requirements was.

What I Did

Stopped waiting for a process and built one

I advocated for a new approach and, honestly, mostly just started doing it. Twice a week, for an hour, I pull the key subject matter experts from each discipline into a session: UX, mobile, cloud, systems, and firmware. I come in with a feature idea grounded in something real: a pattern from user reviews, a complaint surfacing repeatedly in Reddit threads, a gap the support team keeps seeing.

Then I let the room work.

Each discipline gets to interrogate the idea from their angle. Is this actually what users need, or are we solving the symptom? Is this feasible on the current architecture? What are the edge cases? What’s the simplest version that would be worth building? We stay away from implementation details, but we leave with an agreed-upon approach that every discipline has had a hand in shaping.

The culture in the room matters as much as the output. These sessions have developed their own inside jokes and shorthand. Engineers who might otherwise work in isolation are talking directly to the UX designer about what users experience. That cross-discipline fluency doesn’t show up in a PRD. It shows up in how the team thinks about problems.

Turned workshop output into working documents

After each session, I take the agreed-upon approach and run it through a custom AI command I built for exactly this purpose. The output is a structured PRD with consistent sections: Customer Pain Point, Out of Scope, Constraints, Functional Requirements, Acceptance Criteria, Epics, and Estimates.
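The actual AI command isn’t shown here, but the shape of its output can be sketched. Below is a minimal, hypothetical rendering step that turns workshop notes into a PRD skeleton with the consistent section order named above; the section names come from this article, while the function, data shapes, and example feature are illustrative assumptions, not the real tooling.

```python
# Hypothetical sketch: render Discovery Workshop notes into a PRD skeleton
# with a fixed section order. Section names match the article; everything
# else (function name, data shapes, example feature) is an assumption.

PRD_SECTIONS = [
    "Customer Pain Point",
    "Out of Scope",
    "Constraints",
    "Functional Requirements",
    "Acceptance Criteria",
    "Epics",
    "Estimates",
]

def draft_prd(feature_name: str, notes: dict[str, list[str]]) -> str:
    """Render workshop notes into a markdown PRD with a fixed section order.

    Sections missing from the notes get a TBD placeholder, so gaps are
    visible at review time instead of being silently dropped.
    """
    lines = [f"# PRD: {feature_name}", ""]
    for section in PRD_SECTIONS:
        lines.append(f"## {section}")
        items = notes.get(section, [])
        if items:
            lines.extend(f"- {item}" for item in items)
        else:
            lines.append("- TBD (flag for next workshop)")
        lines.append("")
    return "\n".join(lines)

# Example: partial notes from one session (hypothetical feature)
prd = draft_prd(
    "Low-battery push notification",
    {
        "Customer Pain Point": ["Users miss events because the device dies silently"],
        "Out of Scope": ["In-app battery history graphs"],
    },
)
print(prd.splitlines()[0])  # → # PRD: Low-battery push notification
```

The fixed section order is the point: every PRD in the backlog reads the same way, so a release-planning discussion can compare options section by section instead of reverse-engineering each document’s structure.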

The Acceptance Criteria and Epics feed directly into development tickets. The Estimates section gives us enough signal to right-size releases before committing to a sprint. The goal is to build up a backlog of completed, estimated PRDs so that when we plan a release, we’re choosing between well-understood options, not scrambling to define things under deadline pressure.

What Changed

  • Five-discipline alignment before implementation replaced after-the-fact reconciliation
  • Feature definitions now carry cross-functional buy-in from the start, which means fewer surprises mid-sprint and fewer requirements changes under pressure
  • The backlog has become a real planning tool, not a wishlist
  • The Discovery Workshop format has been picked up by other product lines at Gentex, though I’m told the meetings are less fun

What I’d Do Differently

Start this before OTA1. The first post-launch release would have gone better: we would have caught the misalignment earlier, shipped faster, and built the organization’s confidence in the process sooner. The battle to prove the ROI on continued PLACE investment would have been easier to win if the first release out of the gate had gone cleanly.

The next step is a quarterly outcomes cadence: defined goals for each quarter, with a backlog of PRDs ready to go before the quarter kicks off. The Discovery Workshops give us a way to define work well. What they don’t yet give us is a forcing function for deciding which work matters most right now and how we’ll know if we shipped the right thing. That’s the gap I’m working on closing.