2026-02-24

Anna @ AMassive

The Tech Industry Won’t Save Us

There are two stories being told about the tech industry right now…

The first is the hype machine: AI will cure cancer, end poverty, let us talk to our pets and fold our laundry. The future is frictionless, abundant and just one more funding round away.

The second is the doom spiral: platforms are rotting from the inside, algorithms are eating democracy and the enshittification of everything is inevitable and irreversible. We’re all going to die alone and penniless, looking at a screen.

Both stories are wrong. Or rather, both are way too dramatic to be true.

That’s the thing about big narratives… they’re easy to sell and hard to escape. And while I’m not saying the 21st century hasn’t been a frickin’ batshit wild ride thus far (cos duh), I suspect the real story is going to be a lot less cinematic than either camp wants it to be.

Mostly, it’s systems doing what systems do.
And people doing what people do inside of them.

That’s what we’re actually dealing with. Yup, it’s shit. And it’s harder to fix. At least knowing it’s nobody’s masterplan makes it easier to breathe. Oddly.

So if there’s no villain, what is there?

Systems, my friend. We are surrounded by systems.

The big, the small, the day-to-day, the natural, material, societal, professional, relational, familial, personal systems that make this world spin. There are so many of them doing their do at so many different levels that we just can’t track them all. Their inputs, outputs, componentry and interconnections are all moving and grooving to beats usually beyond our control, assuming we’re even clued in enough to see them at all.

Systems are great. They get things done. But there are a few fundamental problems:

Systems can be hard to see clearly

And how you view a system changes based on where you’re looking at it from. The observer’s perspective, position and method of measurement dictate which aspects of the system are highlighted and which are suppressed.

So a system that keeps people dependent on food banks and charity may look good on a funder’s dashboard (meals delivered, people served) but leaves beneficiaries stressed, isolated and unable to plan long-term. The system appears functional or broken depending entirely on where you stand inside it — and crucially, on who gets to decide which position is the correct one from which to measure.

Systems can have unintended consequences

YouTube built a recommendation engine optimised for one thing: watch time. Longer viewing meant more ad revenue – clean, simple, logical. What nobody planned was that the algorithm would learn, from billions of clicks and watches, that increasingly extreme content kept people watching longer.

Nobody sat down and designed a radicalisation pipeline. They built an engagement engine. The pipeline was what happened next. Just an incentive structure, quietly working, at scale.
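You can watch that incentive structure do its quiet work in miniature. The sketch below is a toy, not YouTube’s actual system: the catalogue, the “extremity” scores and the user model are all invented for illustration. The recommender only ever sees one number, watch time, yet because (in this made-up world) extremity and watch time correlate, a plain greedy strategy drifts toward the extreme end of the catalogue on its own.

```python
import random

random.seed(0)

# Toy catalogue: "extremity" is a hypothetical 0-1 score. The engine
# never sees it; it only sees watch time.
CATALOGUE = [{"id": i, "extremity": i / 9} for i in range(10)]

def watch_time(video):
    # Hypothetical user model: more extreme content holds attention longer.
    return 1.0 + 4.0 * video["extremity"] + random.gauss(0, 0.2)

def greedy_recommender(rounds=500, explore=0.1):
    """Serve whichever video has the best average watch time so far,
    with a little random exploration mixed in."""
    totals = {v["id"]: 0.0 for v in CATALOGUE}
    counts = {v["id"]: 1 for v in CATALOGUE}
    served = []
    for _ in range(rounds):
        if random.random() < explore:
            video = random.choice(CATALOGUE)
        else:
            video = max(CATALOGUE, key=lambda v: totals[v["id"]] / counts[v["id"]])
        totals[video["id"]] += watch_time(video)
        counts[video["id"]] += 1
        served.append(video["extremity"])
    return served

served = greedy_recommender()
early = sum(served[:50]) / 50
late = sum(served[-50:]) / 50
print(f"average extremity, first 50 recommendations: {early:.2f}")
print(f"average extremity, last 50 recommendations:  {late:.2f}")
```

Nothing in the loop mentions extremity. It just chases its one metric, and the pipeline is what happens next.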

Systems can be gamed by people who know how

Complex systems have levers: points where a small input produces a bigger output. Most of us don’t know where they are. Some people do. They know exactly where to press. Some of them have the means to make it pay.

Consider what happens when a senior politician drops, in public, that back-channel talks with a hostile nation might be closer than anyone thought. Calls get made. Money moves and markets follow. By the time the news cycle catches up and the lies are exposed, the damage is already done. Those who moved first have already cashed out.

That’s not a masterplan either. It’s just a bunch of operators working the oldest market manipulation trick in the book.

This is what we’re actually navigating. Not evil. Not incompetence. Just systems doing what systems do… and, occasionally, someone who knows exactly which key to press.

📐 Viewfinder: Frame of Reference How you see a system depends entirely on where you’re standing inside it. The observer’s position, perspective, and method of measurement determine which parts of the system are visible and which are hidden. There is no view from nowhere. The question worth asking is always: whose position is being treated as the neutral one — and why?

📐 Viewfinder: Incentives Drive Behaviour People respond to reward structures, not intentions. This explains more than malice ever could.

📐 Viewfinder: Second-Order Effects Solutions create new problems. Technology deployed at speed generates consequences nobody planned for — and that some people profit from anyway.

So why can’t the industry fix this itself?

Industries don’t self-regulate. Not because the people inside them are uniquely corrupt, but because the incentives won’t allow it: no single company will voluntarily eat costs that leave its rivals laughing all the way to the bank. Someone has to force the whole sector to move together. In tech? That hasn’t happened.

This isn’t about bad people making bad calls. It’s about structures that bake certain outcomes right into the system, no matter what anyone intends.

Start with the money. Most of the tech companies shaping your daily digital life were built on venture capital. And VC doesn’t want healthy or sustainable. It wants a unicorn, a blockbuster hit. Which means the product gets pushed past what’s good for you. Because “good for users” and “good for investors” aren’t even in the same universe.

📐 Viewfinder: The Blitzscaling Trap VC-backed companies can’t plateau. Growth or death. The product will always be pushed past what’s good for users because “good for users” and “good for investors” are structurally different goals.

📐 Viewfinder: The Exit Imperative Most founders are building for acquisition or IPO, not longevity. You are inheriting someone else’s exit strategy.

Even if a company survives long enough to go public, it doesn’t get easier. Public companies have a legal obligation to prioritise shareholder returns. Not users. Not society. Shareholders. Trusting a publicly traded tech company to act in your interest isn’t naive. It’s a category error. You’re asking an entity with a legal duty to someone else to put you first.

📐 Viewfinder: Fiduciary Duty to Shareholders Public companies are legally obligated to prioritise shareholder returns. Not users. Not society. Trusting a public tech company to act in your interest is a category error.

Then there’s the product itself. If you’re not paying for it, you are not the customer. You are what’s being sold. Your attention, your behaviour, your data. Packaged and transferred to people willing to pay for access to you. The relationship has a direction, and it isn’t toward you.

📐 Viewfinder: The Free Product Problem If you’re not paying, the business model is built around someone else’s willingness to pay for access to you.

Because attention is the scarce resource, the metric that drives every design decision is time-on-platform. Not your wellbeing. Not your understanding. Not your satisfaction. Just your time. The product that keeps you engaged longest wins, regardless of what that engagement costs you.

📐 Viewfinder: The Engagement Trap Time-on-platform is the metric. Not your wellbeing. Not your understanding. Just your time.

📐 Viewfinder: Regulatory Arbitrage Tech companies scale faster than regulators can respond — intentionally. By the time the rules catch up, the damage is done and the money is made.

📐 Viewfinder: The Pivot When a product stops being profitable it gets pivoted, sold, or shut down. Your dependency on it is not their problem.

📐 Viewfinder: Network Effect Lock-In Switching costs are a feature, not a bug. Dependency is the product.

None of this requires a conspiracy. It doesn’t even require bad faith. It just requires a system following its own logic to its natural conclusion.

Consequences

So what does a system following its own logic to its natural conclusion actually look like in practice?

It looks like this. Every time you open an app, search for something, click a link, pause on a post, or leave your location services on, that behaviour gets recorded. Categorised. Added to a profile that probably knows you better than you do.

Data plus behaviour equals user profile. And that profile doesn’t sit still. It gets bought and sold by data brokers, most of whom you’ve never heard of. Their whole business is packaging and reselling you. Not your name. Your patterns. Your predicted behaviours.
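To make “data plus behaviour equals user profile” concrete, here’s a deliberately crude sketch. The event log, topics and inferences are all made up, and real brokers use vastly richer models, but the shape of the move is the same: raw actions in, patterns and predictions out.

```python
from collections import Counter
from datetime import datetime

# Hypothetical event log: the kind of exhaust one ordinary day produces.
events = [
    {"ts": "2026-02-24T08:02", "type": "search", "topic": "knee pain"},
    {"ts": "2026-02-24T08:05", "type": "click",  "topic": "running shoes"},
    {"ts": "2026-02-24T12:40", "type": "pause",  "topic": "mortgage rates"},
    {"ts": "2026-02-24T12:44", "type": "click",  "topic": "mortgage rates"},
    {"ts": "2026-02-24T23:10", "type": "search", "topic": "insomnia"},
]

def build_profile(events):
    """Fold raw behaviour into the kind of summary that gets resold:
    not your name, your patterns."""
    topics = Counter(e["topic"] for e in events)
    hours = [datetime.fromisoformat(e["ts"]).hour for e in events]
    return {
        "top_interests": [t for t, _ in topics.most_common(3)],
        "active_hours": sorted(set(hours)),
        "inferred": {
            # Illustrative inferences only; the real versions are far subtler.
            "late_night_user": any(h >= 23 or h < 5 for h in hours),
            "health_signals": any(t in topics for t in ("knee pain", "insomnia")),
        },
    }

print(build_profile(events))
```

Five events and you already have interests, a daily rhythm, and a couple of health guesses. Scale that to years of behaviour and the profile writes itself.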

None of us are anonymous online. Not really. The idea that you can opt out through obscurity is just a comforting story we tell ourselves.

The data isn’t always accurate either. Nobody fact-checks your profile. Which matters a lot when it’s used for your credit score, insurance premiums, job applications, or even police interactions in some places. Junk data causes real harm. And you have almost no way to fix it.

There’s also this. Your data doesn’t just expose you. It exposes the people around you. Location patterns, communication metadata, who you hang out with. Your digital footprint drags others into it, whether they signed up for that or not.

And it’s permanent. Data collected today will sit in systems you can’t see, for purposes nobody’s dreamed up yet, long after you’re gone. That’s not paranoia. That’s just how storage works.

The Narrative Being Pushed

Here’s the part nobody talks about.

Everything I’ve just described — the systems, the structures, the data, the extraction — generates a very particular emotional response. Overwhelm. Fatigue. A creeping sense that it’s simply too big, too embedded, too late.

And that feeling has a name: defeatism.

And it is the most useful thing the system ever produced.

Not by design. Remember… no masterplan, no puppet master. But when the natural byproduct of complexity is paralysis, and paralysis keeps people from demanding change, certain people benefit enormously from you staying exactly where you are. Overwhelmed. Scrolling. Consuming. Not translating.

The narratives that feed this are everywhere. Privacy is dead. You’ve already been tracked. The algorithms already know you better than you know yourself. What’s the point?

The point is that we all adjust our behaviour when we know we’re being watched. The point is that consent still matters even when it’s being routinely ignored. The point is that a system that benefits from your passivity will always have good reasons for why resistance is futile.

Privacy isn’t dead. Defeatism is a choice. And it’s worth asking who benefits when you make it.

📐 Viewfinder: The Outrage Cycle Attention → reaction → amplification → escalation → fatigue → repeat. Fatigue is where passivity lives.

📐 Viewfinder: Narratives Simplify Reality Stories flatten complexity so humans can process events. “Tech is evil” and “tech will save us” are both stories. Neither is the map.

What You Can Actually Do

None of this requires you to go off-grid, delete everything, or become a digital hermit. Small, considered changes to how you move through the digital world add up. Here’s where to start.

Your Browser Switch to Brave or Firefox. Stop handing your browsing history to the company whose business model depends on it. [links to browser guide]

Your Search Use DuckDuckGo. Your searches reveal more about you than almost anything else online. They shouldn’t be logged, profiled, and sold. [links to search privacy guide]

Your Passwords Use a password manager — Bitwarden, 1Password, or KeePassXC. One long, unique, random password per site. This single change meaningfully reduces your exposure to hacks, breaches, and account takeovers. [links to password guide]
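If “long, unique, random” sounds abstract, this is all a password manager is doing under the hood when it generates one. A minimal sketch using Python’s standard-library `secrets` module (which is designed for security-sensitive randomness, unlike `random`); it’s an illustration of the idea, not a substitute for an actual manager, which also stores and fills the results for you.

```python
import secrets
import string

def make_password(length=24):
    """One long, unique, random password per site: drawn from the full
    letters/digits/punctuation alphabet with a cryptographic RNG."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(make_password())
```

The point of uniqueness: when one site is breached, the leaked password opens exactly one door instead of all of them.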

Your Communications Move sensitive conversations to Signal. Use Proton for email. Consider a VPN. These aren’t paranoid choices — they’re the digital equivalent of closing the curtains. [links to communications guide]

Your Data Trail Stop clicking “remember me”. Use masked credit cards for online purchases. These small friction points limit how much of your behaviour gets captured and stored. [links to data trail guide]

Your News Build your own RSS feed. Break up with the algorithm as your primary news source. Curate what comes to you rather than accepting what you’re served. [links to RSS guide]
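There’s no magic in RSS, which is rather the point. A feed is just a small XML file of titles and links that you choose to pull, with no algorithm deciding what you see. This sketch parses a minimal, invented RSS 2.0 snippet with Python’s standard library; a real reader would fetch each feed URL you follow and do exactly this.

```python
import xml.etree.ElementTree as ET

# A minimal made-up RSS 2.0 feed, inlined so the sketch runs offline;
# in practice you would fetch this XML from each feed URL you follow.
SAMPLE_FEED = """<rss version="2.0"><channel>
  <title>Example Blog</title>
  <item><title>Post one</title><link>https://example.com/1</link></item>
  <item><title>Post two</title><link>https://example.com/2</link></item>
</channel></rss>"""

def latest_items(feed_xml, limit=5):
    """Pull (title, link) pairs out of an RSS feed: the whole trick
    behind curating what comes to you."""
    root = ET.fromstring(feed_xml)
    items = root.findall("./channel/item")[:limit]
    return [(item.findtext("title"), item.findtext("link")) for item in items]

for title, link in latest_items(SAMPLE_FEED):
    print(f"{title} -> {link}")
```

You decide the sources, the reader shows them in order, and nothing in between is optimising for your attention.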

Your Awareness Privacy is about consent. Start choosing companies that respect it. [links to privacy-first services guide]

No Villains. No Saviours. Here’s The Truth About Where We Are

I’m not going to tell you that everything is fine. It isn’t.
I’m not going to tell you that one browser switch or one deleted app will fix a structural problem. It won’t.
And I’m not going to perform optimism at you — the relentless, exhausting kind that papers over actual concern with motivational poster energy.

What I am going to say is this: staying grounded right now is genuinely hard. Staying aware without becoming paralysed. Staying hopeful without becoming naive. Holding all of this — the complexity, the contradictions, the slow grind of systems that weren’t built with you in mind — and still choosing to show up and do something with it. That takes something.

But here’s what I know. Things can be alright. Not perfect. Not fixed. Not disrupted into some frictionless utopia. Just — alright. Better than they were. More honest than they were. More yours than they were.

That’s the work. Not a revolution. Not a manifesto. Just small steps, ever forward, and as many of us making them together as possible.

Here’s where we start to make it make sense.

Go Deeper

These are the people doing the critical work on tech accountability, privacy, and digital rights. Worth your time.

Podcasts

  • Tech Won’t Save Us — Paris Marx interviews experts every Thursday to examine what the tech industry is actually doing to our world. TechWontSave.Us
  • This Machine Kills — Jathan Sadowski and co-hosts on technology, power, and politics. Pairs well with the above.

Newsletters

  • Platformer — Casey Newton on platform accountability. The publication of record for how social media is reshaping the world. platformer.news
  • Blood in the Machine — Brian Merchant on Big Tech and labour. Sharp and well-reported.

Organisations

  • Electronic Frontier Foundation (EFF) — the longest-standing digital rights organisation. Fights in courts and Congress for your privacy. eff.org
  • Dark Times Academy

Books

  • Road to Nowhere — Paris Marx. The definitive critique of Silicon Valley’s promises.
  • Blood in the Machine — Brian Merchant. The history of technology and worker resistance.