Every design decision is a trust decision. When you hide a privacy setting three menus deep, you're saying "trust us." When you make a cancellation flow harder than the signup flow, you're saying "we don't trust you to make good decisions." The current state of digital design is a crisis of trust wearing the mask of user experience.
Harry Brignull coined "dark patterns" to describe interfaces designed to trick users. Confirm-shaming ("No thanks, I don't want to save money"), hidden costs, forced continuity, disguised ads. These aren't bugs. They're features. Someone designed them deliberately, tested them carefully, and shipped them because they work.
The business logic is clear: dark patterns increase conversion in the short term. The human logic is equally clear: they destroy trust in the long term. Every time I have to click "manage preferences" and manually uncheck 47 cookie categories, I'm being told that my time is worth less than the company's data appetite.
The best example I know: Basecamp (now 37signals) put the "cancel your account" button right on the main settings page. No confirmation maze, no "are you sure?" guilt trips. Just a button.
Their reasoning: making it easy to leave makes people more likely to stay. If you know you can leave at any time, you stop feeling trapped. The relationship shifts from coercion to choice, and choice builds loyalty.
Here's the uncomfortable part: full transparency isn't always better. The philosopher Onora O'Neill has argued that excessive transparency can actually reduce trust. If a hospital publishes every internal incident report, patients might lose confidence even when the hospital's safety record is excellent. Context matters more than data.
Good trust design isn't about showing everything. It's about showing the right things at the right time, and being honest about what you're not showing and why.
Rachel Botsman's Who Can You Trust? traces how trust has shifted from institutional (I trust the bank) to distributed (I trust the Uber rating system) to algorithmic (I trust whatever the algorithm recommends). Each shift creates new vulnerabilities. We've outsourced trust to systems we don't understand.
Cory Doctorow's concept of "enshittification" describes the lifecycle: platforms start by being good to users, then shift value to business customers, then shift value to themselves. It's a trust trajectory — earn trust, leverage trust, burn trust.
Design alone can't fix broken business models. If a company's revenue depends on user manipulation, the best designer in the world can't make it ethical. Trust in design is downstream of trust in business models. The dark patterns will keep coming as long as the incentives reward them.
I've started evaluating apps by a single question: how easy is it to leave? The answer tells you everything about the company's relationship with its users. Easy exit means confidence. Difficult exit means desperation. Design for confidence.