Not Just Security: How Logins Extract Free Labour & Data
Once again, I found myself staring at a CAPTCHA. A bicycle—taking up most of the image. It’s blurry, the edges indistinct. The cyclist’s hand seems to be entering the adjacent box. Does that count? Was I verifying the bike or the rider? I hesitated but clicked eventually.
Then came the login prompt. I have had an account on this site for years. It’s a newspaper, not NATO. I pay for the service, but feel like a stranger, forced through verification steps that seem to serve a purpose beyond security. And, as always, the push to sign in with Google or Facebook. The email login option still exists, but it’s slower, harder to find, and more cumbersome than it used to be.
Security or Extraction?
It’s easy to accept these steps as necessary. However, beyond stopping bots, all this theatre serves another function: data collection, behavioural tracking, and AI training. Somewhere, my hesitation and my every keystroke were recorded, analysed, and added to an invisible dataset. In essence, my daily actions are contributing to the training of AI models, at no cost to the companies benefiting from this process.
CAPTCHAs as Free Labour
Those frustrating image challenges are not just filtering out bots; they are improving AI-driven image recognition. Every click refines a dataset—at no cost to the company using it.
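Vendors don’t publish the details of how challenge responses feed their training pipelines, but the core mechanic—crowd consensus turned into labels—can be sketched in a few lines. Everything below is my own illustration: the function name, thresholds, and data are invented, not any real CAPTCHA provider’s code.

```python
from collections import Counter

def aggregate_labels(responses, min_votes=3, agreement=0.7):
    """Turn many users' CAPTCHA answers into training labels by majority vote.

    responses: dict mapping image_id -> list of answers from different users.
    Only images with enough votes and strong enough consensus get a label.
    """
    labels = {}
    for image_id, answers in responses.items():
        if len(answers) < min_votes:
            continue  # not enough free human labour yet
        top_answer, count = Counter(answers).most_common(1)[0]
        if count / len(answers) >= agreement:
            labels[image_id] = top_answer  # consensus becomes "ground truth"
    return labels

# Four strangers solved the same blurry-bicycle challenge; their clicks
# quietly become a labelled training example.
votes = {
    "img_001": ["bicycle", "bicycle", "bicycle", "motorbike"],
    "img_002": ["bicycle", "car"],  # too few votes -> discarded for now
}
print(aggregate_labels(votes))  # {'img_001': 'bicycle'}
```

The point of the sketch is the economics, not the algorithm: each verified user contributes a vote, the votes are aggregated at effectively zero marginal cost, and the resulting dataset belongs entirely to the platform.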
Nudging Toward Social Logins
Platforms increasingly design login flows to favour social logins, which link a user’s data across multiple services. The small inconveniences built into traditional logins are not accidental.
Friction as a Retention Tool
Making logins slightly more difficult can also be a way to encourage users to stay logged in permanently—increasing engagement, tracking consistency, and dependency on the platform.
Trust Erosion
Individually, these tactics are small irritations. Collectively, they contribute to something more corrosive: the steady erosion of trust. Personally, I’m becoming more aware of when I’m being nudged, and I don’t like it. Assuming that consumers won’t notice or won’t object to these tactics is not just a mistake—it’s exploitation.
Inconvenience is one thing; systems designed in bad faith are quite another. Platforms should ask permission, not impose. They should explain the full extent of what is being collected, not obscure it. Most of all, they should treat users as partners in data collection, not as passive sources of free labour.
B2C Has Become B2C2S
What we’re seeing is a shift from traditional B2C (business to consumer) to B2C2S (business to consumer to supplier), where the consumer no longer simply purchases products or services but becomes an ongoing supplier of personal data that is sold on to other businesses. Users are no longer just customers: they are herded toward maximum consumption, and now also serve as uncompensated contributors to AI training, behavioural modelling, and data monetisation.
Tech retail is taking us backwards. It’s taken decades to establish fairness in B2C. Regulators have stepped in repeatedly where large companies meet individual consumers because the power imbalance is too great. We’ve tackled excessive credit, misleading advertising, and predatory subscription models—all because companies, left unchecked, designed systems that benefitted them while leaving consumers trapped.
Tech companies may argue that these systems are simply the cost of security or convenience, but they are, at their core, one-sided transactions. Users provide increasing amounts of data, time, and effort, but the benefits flow in only one direction.
The Cynicism Will Implode
At what point does this model collapse under its own cynicism? Customers are not raw material. Profiting from our data without consent or fair compensation is not a viable long-term business model, whether measured individually or at a societal level.
Security is necessary, but if platforms value user data this much, they should either ask transparently or pay for it. The covert extraction model, wrapped in layers of false security, is not just flawed—it’s unsustainable.
Regulation exists in consumer markets for a reason. This imbalance of power needs to be acknowledged and managed—before it becomes another era we regret, having allowed it to persist unchecked.