The UX developer role has always been hard to explain to everyday folks. “So you design stuff?” Not exactly. “So you code stuff?” Sort of. For years, people who sit between design and engineering have fought for a seat at a table that wasn’t really built for them. Then AI showed up and flipped the whole thing.
Being a UX developer comes with a built-in identity crisis. You’re not designer enough for the design team and not engineer enough for the engineering team. Your title changes every two years depending on which LinkedIn trend is peaking. “UI developer.” “Design technologist.” “Frontend engineer.” The work stays the same. You’re the person translating between two groups that speak different languages, making sure what gets built actually matches what was intended.
For most of my career, that translation work has felt undervalued. Orgs didn’t know where to put us, so they just kept reorganizing until it was someone else’s problem. We’d get shuffled between departments. Left off project kickoffs. Skipped in planning. Then someone would ask us to justify why our role existed at all, because “the designers can just hand off specs and engineers can just build them.” Sure. And you can also throw a football at someone’s face and call it a pass.
What AI actually needs
Here’s what’s changing. AI tools can generate UI code at a speed that would’ve seemed absurd three years ago. You can describe a component in plain English and get something functional back in seconds. That’s impressive, and it’s only getting better.
But generating code isn’t the hard part. It never has been. The hard part is generating the right code: code that respects your design system’s API conventions, uses the correct tokens instead of hardcoded values, follows your accessibility patterns, and fits into the architecture your team has been building for years. AI doesn’t know any of that unless someone teaches it.
That “someone” is the person who already understands both sides. The person who knows why the design team chose a specific spacing scale and how the engineering team implements it. The person who can look at AI-generated output and immediately spot that it’s using the wrong component variant or ignoring a semantic token that exists for exactly this use case.
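To make that concrete, here’s the kind of before/after that person catches constantly. This is a hypothetical sketch in TSX, assuming a React setup; the package names, the Button variant, and the token names are invented stand-ins for whatever your system actually ships.

```tsx
// Hypothetical package and token names, for illustration only.
import { Button } from "@acme/design-system";
import { tokens } from "@acme/design-tokens";

// What a generator often hands you: it works, but it hardcodes values
// the system already encodes as tokens and skips the existing component.
const SaveButtonGenerated = () => (
  <button style={{ background: "#0055ff", padding: "12px 20px" }}>Save</button>
);

// What it should have produced: the system's own Button variant plus the
// semantic spacing token that exists for exactly this use case.
const SaveButtonCorrected = () => (
  <Button variant="primary" padding={tokens.space.inset.md}>Save</Button>
);
```

Both render a button. Only one of them survives the next rebrand.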
That’s the UX developer. That’s the bridge role. The one your org still can’t figure out where to put on the chart.

The new leverage
What’s wild is that the day-to-day work of a UX developer (writing component documentation, defining token taxonomies, building usage guidelines) is now directly feeding AI systems. The documentation you write becomes the context window. The component API you designed becomes the constraint that keeps generated code on the rails. Those design decisions you encoded into tokens? That’s the vocabulary AI uses to make choices.
This isn’t theoretical. I’ve been building MCP servers and Claude Code skills that let AI tools interact with our design system directly. The whole exercise is bridge work. You need to understand the design intent deeply enough to encode it as rules, and you need to understand the engineering architecture well enough to make those rules actually useful in a development workflow. Strip out either side, and the whole thing falls apart.
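If you’re wondering what that looks like in practice, here’s a stripped-down sketch of the token half of it. This isn’t our actual server, just the shape of the idea: it assumes the official TypeScript MCP SDK (@modelcontextprotocol/sdk) plus zod, and the token names and values are invented.

```ts
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// Stand-in for a real token export (yours might come from Style Dictionary,
// a Figma sync, or a hand-maintained JSON file).
const tokens: Record<string, string> = {
  "color.action.primary": "#0055ff",
  "color.text.subtle": "#5f6368",
  "space.inset.md": "12px 20px",
};

const server = new McpServer({ name: "design-system", version: "0.1.0" });

// Give the model a way to look up semantic tokens instead of guessing
// hex codes and pixel values.
server.tool(
  "lookup-token",
  "Find design tokens whose names contain the given text",
  { query: z.string() },
  async ({ query }) => {
    const matches = Object.entries(tokens)
      .filter(([name]) => name.includes(query))
      .map(([name, value]) => `${name}: ${value}`);
    return {
      content: [
        {
          type: "text" as const,
          text: matches.length ? matches.join("\n") : "No matching tokens.",
        },
      ],
    };
  },
);

// Run over stdio so a client like Claude Code can launch and attach to it.
await server.connect(new StdioServerTransport());
```

The tool itself is trivial. The judgment is in deciding what to expose, how to name it, and which rules ride along with it, and that’s the bridge work.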
The UX developer role went from “nice to have” to a force multiplier. When AI can generate code but can’t generate judgment, the person who provides that judgment is steering the ship. You’re not slowing things down. You’re the reason the output is actually usable.
Stepping into it
I’m not writing this to celebrate a victory; it’s obviously still wild out here. The role still comes with the same challenges it always has: ambiguity, a weird spot on the org chart, and the endless explaining of what you do. What’s changed is the leverage. If you’ve spent years building the connection between design and engineering, you have exactly the skill set that makes AI useful rather than just fast.
Take ownership of it. Define how AI interacts with your systems. Write the documentation that becomes that context. Build the tooling that keeps generated output aligned with your team’s standards. It turns out we’ve been building this bridge the whole time. The only thing that changed is that everyone, including the robots, now needs to cross it.