
#162: Matt D Smith – Your AI Edge is The Vocabulary You Already Have

What a decade of design fundamentals taught me about delegating to Claude Code—and why Shift Nudge was secretly an AI onboarding course before AI existed.

Matt D. Smith is the founder of Shift Nudge, a professional interface design training platform for working designers. Rising to prominence in the 2010s for his systematic approach to visual interfaces, he became known for turning over 20 years of interface design practice into a structured curriculum used by thousands of designers worldwide. His work on design patterns and tools has made him a widely regarded figure in modern interface design education.

Through Shift Nudge, he built a global program that helps designers advance their careers in as little as 8–12 weeks while receiving mentorship and support for a full year, equipping them to lead teams and ship production-quality interfaces. He became known for transforming working designers’ income trajectories, with students reporting income growth of 2x within a few years by applying his methods in typography, layout, and spacing. He has trained designers from leading startups and global brands, positioning the program as a modern alternative to traditional design education.

His career highlights include pioneering the Float Label pattern in 2013, a form interaction now adopted across products from Apple, Google, and countless consumer applications. He also created the interface design tools Contrast and Flowkit, Figma plugins that have reached tens of thousands of users and are used to check color contrast and design user flows inside modern design tools. Beyond product work, he has served as an adjunct professor at the University of Georgia and delivered workshops and talks at conferences including Adobe MAX, Dribbble Hangtime, Figma’s Config, Smashing Conference, and others, extending his influence from the classroom to stages across the United States.

Listen to episode 162 on Apple Podcasts and Spotify


“I have a weird obsession with trying to get the absolute most difficult username across every platform,” Matt says, and it lands like a confession. He goes by MDS on the internet. Three letters. You can imagine the negotiations, the dead accounts, the patience required.

We’re a hundred episodes into knowing each other—he was guest number 50, and now here we are past 150—and he’s still introducing himself as someone in transition. “I’m a designer turned educator now sort of turning into a CEO trying to figure out how to run a design education business.” There’s something in how he says trying to figure out that earns the pause that follows.

I’ve watched Matt’s public work for almost a decade. I was the third beta tester to graduate from Shift Nudge back in 2020. I bought low, as I like to say. The course has appreciated since then, but so has something else—something I didn’t understand I was learning until AI came along.

When Claude Code got good enough to actually help with design tasks, I noticed I could delegate effectively while other people couldn’t. The difference wasn’t technical skill. It was vocabulary. Every time I’d tell Claude to “adjust the row height” or “try a card component instead of a list view,” I was drawing from a library of concepts Matt had codified years earlier. Those concepts weren’t just design rules. They were the building blocks of clear instruction. The most valuable thing I learned from Shift Nudge was the vocabulary.

When I became a design lead, I could articulate with precise vocabulary what wasn’t working in someone’s design. Subject, object, verb. The spacing is off for this reason. That precision made me good at delegating to humans. Now it makes me good at delegating to machines, in the form of skill files for AI agents.

Matt nods slowly. “Skill files,” he says, “they’re good at getting directionally correct, especially things that are like absolutely binary. Is this the way you write an HTML link or is it not? It’s definitely right and wrong.” He pauses. “Whereas design... there’s more gray area than black and white.” He was capturing the nuance I’d missed in my observation.

He’s talking about Claude’s skill files—those markdown documents that give AI context about how you want things done. And he’s right that they work best for the binary stuff. But here’s the connection he helped me articulate: skill files are functionally identical to the Standard Operating Procedures you’d write for a junior designer.
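To make the SOP parallel concrete, here is a sketch of what such a skill file might contain. The filename, headings, and every rule below are hypothetical examples written for illustration, not anything quoted from the episode:

```markdown
# SKILL.md: interface design review (hypothetical example)

## Binary rules (right or wrong, safe to enforce automatically)
- Links must be real `<a href="...">` elements, never `<div>`s with click handlers.
- All spacing values come from a 4px scale: 4, 8, 12, 16, 24, 32.
- Body text never drops below 16px.

## Directional guidance (gray area: flag it, don't auto-fix)
- If a screen mixes more than two font weights, suggest consolidating.
- Prefer a card component over a list view when items carry images.
- When contrast looks borderline, ask before changing brand colors.
```

The split mirrors Matt’s point: the binary rules are the part an agent can apply reliably, while the gray-area section only works because someone with the vocabulary wrote it down.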

I bring up The Defiant Ones, the documentary about Dr. Dre and Jimmy Iovine. When Jimmy was learning to be a record producer, his mentor taught him by working through him. “Adjust the reverb. What happened there? Why did that work? Why did that not work?” It’s the master-apprentice model, I say. And I think that’s where things are going with AI.

Matt leans into it. “You still need that institutional knowledge, the vocabulary. AI can adjust the reverb and adjust the echo and adjust the panning. Oh, you want five different beats? But it’s like—why? How much? When do we stop?” He lets the questions hang. “That creativity... I think there’s gonna be, you know, in the same way that there was a big resurgence of live in-person things after COVID—I think we’re all gonna be like, it’s just refreshing when I read something online and I can tell that a human wrote it.”

There’s something in his voice when he says refreshing. Like he’s already tired of the alternative.

I ask about the divergence he sees coming—who wins, who loses. He doesn’t hesitate.

“There’s gonna be a divergence where the person who doesn’t use AI is just simply not as effective as the person who learned how to use it. But then there’s also gonna be a divergence of—I’m using AI all the time and this other person is like, well, I learned a lot of things before AI existed and I use AI and now I know more than you.” He pauses. “And this other person’s just fully reliant on AI and they don’t know much.”

It’s gonna be harder to learn things, he says, because AI is so instant. “It makes it like painful to sit down and read something and actually learn it yourself.”

The irony is that the people most equipped to leverage AI are the ones who invested in their own brains before these tools existed.

Matt has a framework for mapping where you fit in all this. He calls it Pioneer, Builder, Consumer.

Pioneers are the people at Anthropic and Cursor and OpenAI—building the intelligence and the harnesses that give it to us. Builders are the developers and designers using these tools to create products. “We’re sort of converging slowly,” he says. “Designers are over here and developers over here, and some are still better at infrastructure and setup and code—like, oh, why would you use useEffect here in React—and designers over here like, what does that mean? But it’s starting to be irrelevant because some of the tools are getting so good.”

And Consumers? “My mom is a good example,” he says. “She’s not choosing to have AI in her life. She’s just seeing it happen through Amazon review summaries or Google AI summaries for the things that she used to search for.”

The question isn’t whether AI will touch your life. It’s which persona you want to occupy.

I push on the vibe-coding hype. All those people on Twitter saying software is cooked because they built a Facebook clone in five minutes.

“I don’t wanna rely on your janky vibe coded app to help me,” Matt says, and there’s a dry humor in it.

I have a follow-up I’ve started using. Whenever someone says “I did this with AI,” I ask: Cool. So what’s your plan to maintain it?

They never have an answer. That’s when you realize why we pay engineers. DevOps, infrastructure, support tickets—that’s the unglamorous work that keeps the train running. Building something on your own is a lot different than supporting a hundred thousand users at once.

Near the end, Matt gets reflective about advice. “You’re gonna need your own knowledge,” he says. “Build that vocabulary through any means possible. Whether it’s asking questions from AI while you’re learning, or watching videos, or attending school. I think there’s still real value in you building your own brain.”

He catches himself. “And if you don’t want to do it—you know, maybe you change careers. I don’t know.” Something shifts. The pragmatism cuts through.

“Just kind of plot yourself,” he says. “Are you a pioneer? Are you gonna be a builder? Are you just gonna be a consumer? Because either way, AI is gonna be touching a part of your life, whether you choose to or not.”

I’ve been thinking about this since we talked. I’m reading books again—not AI books, the fire hose has enough of those. I’m building vocabulary in domains outside tech: marketing, strategy, positioning.

The cost of building has collapsed. The cost of deciding what to build has not.

The taste worth studying isn’t all in tech right now. It’s in the humanities, philosophy, long-form content.

That’s where I’m looking.


The Way of Product w/ Caden Damiano is a reader-supported publication. To receive new posts and support my work, consider becoming a free or paid subscriber.
