Money (That’s What They Want)

February 13, 2019

Brenda McPhail
Director of Privacy, Technology & Surveillance Project
bmcphail@ccla.org

It shouldn’t come as a shock to anyone that the collection of our private, personal information by companies is entirely out of commercial self-interest–but I read a couple of news stories not so long ago that reminded me we need to call it out. Why? Because there are increasing signs that the corporate PR is working, that many of us are buying into the fiction that some mildly curated content is a fair exchange for the granular details of our lives, be they trivial or intimate.

The story was about two bereaved women, each subjected to marketing targeted at expectant or new mothers after the loss of their babies. It appeared around the same time I read a Guardian interview with Shoshana Zuboff, a Harvard business professor, discussing her long-awaited book on the internet's dominant business model, "The Age of Surveillance Capitalism".

Juxtaposed, these two things encapsulate much that is wrong with the ways our personal information is being collected, used, and abused by the companies we deal with both online and in the physical world. It's worth noting that the distinction between online and offline is rapidly becoming meaningless when it comes to data collection, as bricks-and-mortar stores are actively leveraging new technology like facial analysis to collect data about us too.

The first story has a title that kind of says it all: “‘No right to make money off us that way’: Woman targeted by baby product marketing after miscarriage.” The article describes the experience of two women. The first shared her contact information and due date with a maternity store she liked and trusted but reportedly wasn’t told that her information would be shared with other stores and companies who sold baby products. She miscarried, a sad loss that she was still dealing with when a box of baby formula samples arrived at her door from a company she’d never done business with.

The second woman recently wrote an open letter to “tech companies” berating them for being quick to target advertising towards her when she shared her excitement over being pregnant on social media but failing to stop even though she also posted about her heartbreak when her son was stillborn.

Marketers might classify this as "personalization"–the promise that in exchange for information that has value to marketers, we get things we want instead of things we don't: ads, recommendations, coupons, whatever. When we're asked for permission to collect and share our information, on a website, app, or store loyalty program, the language used to convince us is about relevance, about personal benefits. The last time I said no to such a pop-up permission box, it warned me that I was choosing to see ads that might not meet my needs. The tone was that of a stern parent admonishing a child for choosing to be bad instead of good.

These women’s stories lay bare the lie that these practices are truly about us and our needs or desires. If they were, someone should have made sure to explain to the expectant mother sharing her due date that the store would share or sell her information to other baby-oriented companies and given her the option to say no. If they were, the same tools used to compile information about the woman sharing her joy about being pregnant and sell that information to interested parties would be designed to update those lists, maybe even issue a proactive warning to purchasers of that data, when she posted about losing the baby.

But that latter point raises another critical question: would it really make things better if the companies collecting information about us as we browse, post our thoughts, or shop online paid even closer attention than they already do? It might make the promises of personalization feel more genuine, but is that ask, which is essentially the one the bereaved mother made in her letter, the right one to make, or does it just open up the possibility of even better real-time tracking of our online/offline behaviour? I'd argue the latter. It makes me sad to think that we've become so acculturated to the idea that someone else deserves to use the information we create when we socialize online that asking for better surveillance feels like a reasonable option.

And here we come to the link to surveillance capitalism. As Professor Zuboff explains it in her book, surveillance capitalism is essentially the monetization of the data that we willingly share, but also the data we create as we navigate life online—the “data exhaust” that offers insights into how we currently behave, and with enough of an accumulation and a bit of analysis, predictions about how we might behave in the future.

Her research makes it crystal clear. It's not about benefiting us, it's about making money—which of course is what capitalism is all about. And there is a lot of money at stake: Alphabet (Google's parent company) reported net income in 2018 of 30.74 billion US dollars (a 142.74% increase over 2017). Further, at this point it's not just the data goliaths like Google or Facebook who operate on this model; it's essentially every internet of things device (products labelled "smart," "networked," or that pernicious word "personalized") that is designed to make money when you buy it, and then more money as you use it and contribute to a data stream that the company can use or sell, or both.

These practices started because we weren't paying attention, and continue because we haven't had the will, or created the regulatory muscle, to stop them. Businesses, even governments—including the Canadian government, in materials from the recent federal consultation on data strategy—are working hard to convince us that our data is the price we have to pay to support innovation and create ongoing economic opportunities. They sometimes compare it to the "new oil," but that metaphor is profoundly misleading. Data is not a natural resource created passively by the decay of dead organisms over time; it is of us and about us, created actively as we live our lives and interact with systems designed to scoop it up, and with companies who deliberately claim the right to use it–and who are winning the battle to convince us they deserve it.

Making people complicit in their own surveillance, as an ongoing project on big data hosted at Queen’s Surveillance Studies Centre points out, raises “ethical questions, political concerns and moral challenges” that go well beyond data and privacy to “penetrate the core of modern democratic principles.” The pain of the two mothers, faced with harsh reminders of their loss, serves as a sad but valuable warning that we lose more than we gain when we fail to question the systems that are not just eroding our privacy rights, not just seeking to manipulate our behaviour, but telling us it’s all for our own good. We need to recognize that claim for the swindle it is because it’s causing demonstrable harm. We deserve an online world that respects our rights and is structured around fair information practices that allow non-exploitative business to thrive while also benefiting us, as consumers, citizens, members of society. It’s time to stand up against the algorithmic wiretapping of our personal lives.