The Ethics Crisis: AI-Powered Dark Patterns

Image: AI dark patterns manipulating a person

The pursuit of growth is running over the pursuit of meaning in technology. Designers are trading empathy for algorithms and swapping genuine research for automated A/B tests. This isn’t just a drop in quality, it’s a collapse in values. The technology we build is meant to serve people, not trick them, yet Artificial Intelligence is blurring that line faster than we can redraw it.

Dark patterns used to be easy to spot: misleading buttons, pre-ticked boxes, and endless “subscribe now” traps. But AI has taken these old tricks and put them on steroids. Machine-learning models don’t just optimise colours or copy; they learn your hesitation, your fatigue, your weakest click. They adapt to your behaviour in real time. That isn’t personalisation, it’s predation. When persuasion becomes predictive, it stops being voluntary.

But let’s be reasonable for a moment. I do understand the counter-argument. They’ll tell you that technology should be useful to a business and its people. That ‘adults should know better’ and that these models are just ‘helping us make guided decisions’ on products we might need. They’ll argue it’s not a privacy intrusion, it’s just ‘learning our patterns’, and that the better they can sell, the better the products we get. It’s the ‘Win-Win’ economy that gives us free services and market efficiency.

I’m telling you, that is a convenient fiction. The idea that an ‘adult’ is a perfectly rational actor is rubbish. The entire advertising industry is built on the fact that we aren’t. To say the ‘flaw is subjective’ is a moral cop-out. It’s the con artist’s defence. When you arm that con artist with an AI that has A/B tested a billion simulations, it is no longer a fair fight. It is exploitation. And the ‘Win-Win’ you’re hearing about? It’s a fantasy. The user’s goal is to find an answer. The platform’s goal is to maximise engagement. The advertiser’s goal is to trigger a conversion. These goals are not aligned; they are in direct conflict:

Your goal as a user is efficiency. You have a problem or a question. You want the best, most accurate solution as fast as possible, with zero friction, so you can close the app and get on with your life. Your perfect experience is frictionless, fast, and finite. You want to get in, get your answer, and get out.

The platform’s goal (whether it’s Google, Facebook, YouTube, or X) is to maximise engagement. It needs to keep you there for as long as possible, measured in minutes and hours. More screen time equals more ad impressions, which equals more revenue.

This is the first direct conflict. Your goal is to leave. The platform’s goal is to make you stay.

The platform is not designed to give you your answer and let you go. It’s designed to trap your attention. It does this with infinite scrolls, “recommended for you” sidebars, autoplaying videos, and notifications. Its success is directly proportional to how long it can delay you from finishing your original task.

Do you honestly believe an algorithm trained to maximise engagement will also respect privacy or mental bandwidth? Of course it won’t. This isn’t clever UX, mates… It’s the industrialisation of deception. We’ve built systems that are too effective at doing the wrong thing, and worse, we’ve normalised it under the excuse of “data-driven design.”

Regulators are finally catching up. The EU’s Digital Services Act and the FTC’s updated guidelines have turned dark patterns from ethical grey zones into legal red lines. Meta’s 2024 penalty for algorithmic manipulation made it clear: compliance is no longer optional. I will only trust a company that treats these laws with respect, not as inconveniences. Businesses clinging to manipulative models aren’t just morally bankrupt, they’re inviting their own collapse. A business built on trickery is a business waiting for its headline.

The real tragedy is that most designers aren’t malicious; they’re trapped inside incentive systems that reward clicks over conscience. OKRs measure engagement, not integrity. Empathy has been replaced by efficiency, and our tools are beginning to dictate our values. Practising ethical design today requires the courage to challenge your own data. To pause a release when your gut says, “This feels wrong.” To be the dissenting voice in the room when everyone else is looking at conversion charts.

We need to return to the idea that UX should stand for user experience, not user exploitation. Our authority as designers comes from our ethics, not from clever prompts that help AI bypass them. If UX was once defined by empathy, then AI ethics must be defined by restraint. The challenge is simple: prioritise autonomy over growth. Because if we fail to learn from the past, we won’t just repeat it, we’ll automate it.

From Outcry to Action

Condemning unethical design is easy. Changing the system that rewards it is not. The reality is simple: ethical design might slow your quarterly metrics, but deception will sink your business in the long run. The only sustainable strategy is one that balances profit with principle. Here’s how every part of the ecosystem can start.

For Designers: Speak the Language of Risk

You won’t win an argument by calling something “evil.” You will by calling it “expensive.”
Executives speak risk and revenue, so speak in their language.

  • Build an Ethical Framework: When a feature feels off, don’t rely on gut instinct. Create a one-page “Risk-Based Pushback” matrix. Map every proposal against FTC and EU DSA guidelines. Label potential violations as you would security risks.
  • Quantify the Cost: Don’t say, “This is a dark pattern.” Say, “This mirrors the one that cost Meta $X million and could lose us Y% in retention next quarter.” Numbers travel farther up the chain than moral appeals ever will.
  • Redefine Success: Replace engagement vanity metrics with integrity metrics. Track Informed Consent Rate, Non-Regretted Retention, or Time-to-Trust. Measure what you actually value, not just what looks good on a dashboard (a rough sketch of how these metrics might be computed follows below).
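
To make that concrete, here is a minimal sketch of how two of these integrity metrics might be computed. The event fields, the five-second ‘plausibly read it’ threshold, and what counts as a ‘regret signal’ are all illustrative assumptions on my part, not an established standard.

    from dataclasses import dataclass

    # Illustrative sketch only: the event schema and thresholds below are
    # assumptions for demonstration, not an industry standard.

    @dataclass
    class ConsentEvent:
        user_id: str
        granted: bool          # did the user opt in?
        dialog_seconds: float  # how long the consent dialog was on screen

    def informed_consent_rate(events: list[ConsentEvent],
                              min_read_seconds: float = 5.0) -> float:
        """Share of opt-ins where the user plausibly read the dialog.
        Consent granted in under min_read_seconds is treated as a
        reflexive click, not an informed choice."""
        granted = [e for e in events if e.granted]
        if not granted:
            return 0.0
        informed = sum(e.dialog_seconds >= min_read_seconds for e in granted)
        return informed / len(granted)

    def non_regretted_retention(active_users: set[str],
                                regretting_users: set[str]) -> float:
        """Retention that excludes users showing regret signals, e.g.
        cancellation attempts, muted notifications, rage-uninstalls."""
        if not active_users:
            return 0.0
        return len(active_users - regretting_users) / len(active_users)

The design choice worth noticing: both numbers get worse when a flow rushes or traps people, so a team optimising them is pulled in the opposite direction from a team optimising raw engagement minutes.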

You built your incentive system. You can redesign it. Trust isn’t sentimental, it’s a business strategy.

For Companies: Treat Trust as an Asset, Not a Slogan

  • Acknowledge the Trade-Off: Ethical growth is slower, but it endures. Users who trust you don’t churn. They advocate. They forgive honest mistakes. That’s compounding interest money can’t buy.
  • Prove It Works: Document case studies of companies that thrive because of integrity, not despite it—subscription services that make cancellation effortless, privacy-first tools that charge fairly. Their “Win-Win” model proves that transparency can scale.
  • Firewall the Designers: Legal teams are separated from sales for a reason. Apply the same principle to design. Protect UX from short-term growth hacking. Their primary OKR should be User Trust and Success, not Engagement Minutes.

For Users: Starve the System

You are not powerless. You are the system’s fuel. Every click, scroll, and swipe is data. Withdraw it.

  • Spot the Pattern: If a product feels too “free,” you’re the commodity. If cancelling takes ten clicks, you’re being trapped. If you leave an app feeling anxious, it’s exploiting you.
  • Vote with Your Wallet: Pay for the tools and creators you value. Support the platforms that earn your trust instead of renting your attention.
  • Vote with Your Data: Block trackers, use privacy-first browsers, and deny data-sharing permissions. Every byte you withhold makes predatory algorithms dumber.

The Real Bottom Line?

This isn’t about moral purity. It’s about survival. Technology built on manipulation is not innovation, it’s extraction. The companies that will endure are the ones building relationships, not dependencies.

Ethical design is not anti-growth. It’s the only form of growth that lasts.
