After years of building, maintaining, and supporting in-house design systems with real tokens, governance, versioning, support, and contribution models, I recently found myself building with Tailwind and shadcn. Thanks, AI!
I’m familiar with these tools but still relatively new to them, so take this for what it is. I’m not anti-AI. I use it every day. But I’ve been watching something happen across the industry that I think is worth talking about: It feels like AI is making our UI decisions for us, and we’re just going along with it. Maybe call me old school, but I think products should have their own soul when it comes to UI and UX. Otherwise, what actually sets us apart?
These days, someone fires up a vibe coding tool and builds a prototype. Could be a manager who’s excited about AI, a designer who started dabbling in code, a dev who wants to move fast. It comes back looking polished. React, Tailwind, shadcn, the whole default stack. It feels fast. It looks professional. And everyone accepts it. The UI direction, the component approach, the framework, the styling, all of it decided by whatever the AI defaulted to that day. By a language model optimizing for the most common output.
From what I can tell, this is happening more and more. And I don’t blame anyone for being excited. These tools can be fun to use. Designers can build something real without waiting on engineering. Decision-makers can see a working prototype in hours instead of weeks. That feels like progress, and in some ways, maybe it is. But there’s a difference between using AI as a tool and letting AI make the decisions. Right now, a lot of people are doing the second thing and calling it the first.
Your product now looks like everyone else’s
When you let AI decide your UI, everything converges. You can spot a shadcn app from across the room. The spacing, the rounded everything, the muted grays, the specific way the buttons and inputs feel. It’s the default aesthetic. Every vibe coding tool out there generates the same stack, so every prototype comes back looking like a cousin of the last one. That’s what happens when a model decides.
And there’s a feedback loop making it worse. AI generates shadcn code. More shadcn code ends up in training data. AI gets even better at generating shadcn code. Round and round. Your product’s look is being decided by what a language model saw the most during training, not by your brand team or your designers. Not by anyone who’s talked to your users. Super inspiring stuff.
If your product looks identical to your competitor’s product, what exactly is your differentiator? UI and UX used to be a competitive advantage. The way your app felt was part of why people chose it. When every SaaS dashboard looks like it came from the same AI prompt, that advantage disappears. Your product becomes a commodity before you even ship it.
shadcn released theming presets to address this. But swapping a color palette isn’t brand identity. It’s a skin. Your UI is part of your brand, and your UX is how people experience it. If your application looks and feels like every other AI-generated output, you’ve handed both to a default setting. Honestly, from a design standpoint, it feels lazy. We used to obsess over the details that made a product feel like ours. Now we’re accepting whatever the AI spits out because it looks clean enough. And it’s hard to even raise the concern because the AI made it look so good out of the box that questioning it feels like you’re the one slowing things down.
Choosing what’s right vs. what’s common
This is the part that gets lost. People see a polished AI-generated UI and assume the tool made a good decision. It didn’t decide at all. It predicted the most likely output based on its training data. shadcn and Tailwind are everywhere in that training data, so that’s what comes out. It’s not a recommendation. It’s a statistical echo.
But people are treating it like a recommendation. A manager vibe-codes a dashboard and thinks, “this is the direction.” A designer builds a prototype and assumes the stack is solid because the output looks professional. Nobody questions whether React is the right framework, whether Tailwind is the right styling approach, or whether shadcn is the right component strategy for their product and users. The AI picked it. It looks good. Ship it.
That’s not a design system either
On top of letting AI make UI decisions, people are also calling the output a design system. It’s not.
AI and shadcn give you components. They don’t give you governance, contribution models, a versioning strategy, token architecture, or cross-team documentation. They don’t give you a shared language between design and engineering. What they give you is a folder full of React files you copied into your repo.
A design system is an organizational tool. The hard part is never “make a button look nice.” The hard part is making sure multiple teams use the same button the same way, that it evolves without breaking things, and that someone owns the decision about what “primary” means across your entire product suite. shadcn doesn’t try to solve that. It’s not designed to. And that’s fine for what it is. But calling it a design system? That’s like calling a pile of lumber a house.
215 lines of React for a radio button
I get why shadcn exists. Building accessible UI primitives from scratch is hard. Getting focus management, keyboard navigation, and screen reader support right takes real expertise. That’s the problem shadcn and Radix are trying to solve, and it’s a legitimate one. But the solution has costs that nobody’s weighing because the AI never brings them up.
Paul Hebert recently tore down the shadcn radio button component. 215 lines of React. Seven imported files. 30 Tailwind classes. All to recreate something HTML has done natively for 30 years.
Instead of using <input type="radio">, shadcn renders a <button> with an SVG circle inside it, then uses ARIA attributes to tell screen readers it’s actually a radio button. Read that again. It’s a button pretending to be a radio button and relying on ARIA to cover for the fact that it didn’t just use the native element. The browser already solved this. Decades ago. But the AI doesn’t know that, and nobody asked.
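To make the contrast concrete, here’s a hedged sketch. The first snippet is the native element the browser has shipped for decades. The second paraphrases the button-plus-ARIA pattern described above; it is not shadcn’s literal output, and the attribute details are illustrative, but role="radio" and aria-checked are the standard ARIA mechanism such components rely on.

```html
<!-- What the browser gives you for free: focus management,
     arrow-key navigation, and screen reader support built in -->
<label><input type="radio" name="plan" value="pro" checked /> Pro</label>

<!-- The pattern described above, paraphrased: a button impersonating
     a radio via ARIA. JavaScript must now reimplement keyboard
     navigation and checked state that the native input handled itself. -->
<button type="button" role="radio" aria-checked="true">
  <svg aria-hidden="true" viewBox="0 0 16 16"><circle cx="8" cy="8" r="4" /></svg>
  Pro
</button>
```

The first version is one line and works with no JavaScript at all; everything the second version adds is surface area someone has to maintain.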
Zoom out, and the abstraction stack is wild. Tailwind abstracts CSS. shadcn abstracts Radix. Radix abstracts the DOM. React abstracts the DOM. Four layers between your user and a radio button. The AI chose every single one of those layers for you. It didn’t ask whether your team needs that complexity or whether a simpler approach would work. At enterprise scale, every one of those layers is a maintenance surface and a possible debugging headache. This is complexity cosplaying as simplicity.
There’s another cost nobody seems to talk about. Teams using Tailwind for long enough start forgetting actual CSS. The muscle memory goes away. When something breaks outside the utility class catalog, nobody knows how to fix it. You’ve traded foundational knowledge for convenience, and that trade gets expensive when things go sideways. The AI forgot to mention that part when it generated the code.
You’re locked in now
shadcn is React. Community ports exist for Vue and Svelte, but they’re unofficial, maintained by different people, with different APIs and different release timelines. If your enterprise has teams on Vue, Angular, Svelte, Rails, or plain JS, you’re now maintaining parallel component implementations or forcing everyone onto React whether they chose it or not.
Here’s what nobody notices: the AI chose React. Not your team. The prototype worked, it looked great, and it naturally became the starting point for real production code. Nobody went back to reconsider the foundation. Why would they? It worked in the demo.
Enterprise environments are messy. Legacy apps, acquired products on different stacks, internal tools built by teams who picked their own framework years ago. A design system needs to serve all of them. shadcn can’t.
What happens when React isn’t the dominant framework anymore? It will happen. jQuery was untouchable once. So was Bootstrap. If your UI is coupled to React’s ecosystem and release cycle, you’ve made a bet on one framework’s future. That bet has an expiration date.
The maintenance and governance
When you use shadcn, you copy source code into your repo. You now own every component. Bug fixes, accessibility patches, and breaking changes upstream are all your problem now. There’s no upgrade path. When shadcn updates, you manually diff and merge. Across how many apps? How many teams?
Tailwind without strict governance turns into class soup fast. Every team invents its own patterns. One team uses px-4 py-2 for button padding, another uses p-3, a third wraps it all in a custom class. Multiply that across hundreds of components and a dozen teams. Good luck with consistency.
Without shared tokens, “our blue” is three different hex values in three different repos. Without versioning, a component change in one app silently breaks patterns in another. Without contribution models, nobody knows who owns what or where the source of truth lives. That’s the work that goes into building a real system. None of it comes in the box with shadcn.
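To make the “our blue” problem concrete, here’s a minimal sketch of what a shared token looks like as a CSS custom property. The token name and hex value are invented for illustration, not taken from any real system:

```html
<style>
  /* One source of truth for "our blue": every app consumes the token,
     nobody hard-codes the hex. Change it here, it changes everywhere.
     (Name and value are illustrative.) */
  :root {
    --brand-primary: #1a56db;
  }

  .button-primary {
    background-color: var(--brand-primary);
  }
</style>
```

The token itself is trivial; the hard part is the governance around it — who owns --brand-primary, how changes are versioned, and how every team finds out when it moves.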
Nobody budgeted for any of this, by the way. Because nobody planned to adopt shadcn as the foundation. An AI picked it, a prototype got everyone excited, and now multiple teams are building on a decision that was never actually made.
So who’s actually deciding what our products look like?
That’s the question I keep coming back to. Vibe coding makes it easy for anyone to build something that looks production-ready, and that’s exciting. But “looks production-ready” and “is the right UI for your product” are not the same thing. Right now, AI is paving over that gap with polished output that feels like a decision was made when it wasn’t.
AI is great at generating code. It is not great at understanding your product, your users, or how your org works. It optimizes for “most common,” not “most appropriate.” And it will never tell you “yo, this might not be the right approach for what you’re building.” It just gives you shadcn and moves on.
Your UI, your UX, and your design system are business decisions. They touch brand, velocity, how teams work together, and how much technical debt gets carried for years. AI didn’t think about any of that when it decided what your product should look like.
I’m curious what other folks are seeing. Are we just accepting whatever UI the AI gives us now? Should vibe-coded prototypes quietly become our production UI? I might be wrong about some of this. I’d love to hear if others are seeing this and how they are navigating it.