The End of Privacy: Or Why Your Home is Not Your Castle And Your Data is not Your Own
Once upon a time, your home was considered your castle, a sphere of absolute privacy where you could reliably escape prying eyes. No one, except perhaps the constable, dared enter your home without permission. And with the eventual rise, at least in the more democratic corners of the globe, of legal privacy rights, even the constable was required to be solicitous of the sanctity of the home as a zone of privacy.
Of course, people have always invited others into their homes. But even an invited and welcome guest is still expected to respect one’s privacy. Though the temptation to do so may sometimes be great, sneaking around, stealing furtive looks into our cupboards, cabinets, or closets is not behavior we would happily tolerate from invited guests. Unfortunately, with the emergence of the so-called smart home, stocked with internet-enabled gadgets galore, gadgets that we have eagerly invited into the home, this sort of surreptitious snooping behavior is becoming more and more common.
According to The New York Times, Amazon, maker of the Alexa voice assistant, has applied for a patent for a “voice sniffer algorithm” that would enable a vast array of smart in-home devices to record and analyze, unobtrusively in the background of your everyday life, reams of audio data. Based on its analysis of precisely when you are liable to use words like “love,” “bought,” or “dislike,” the algorithm will calculate which perfectly tailored ads to show you as you sprawl comfortably on your couch, watching television, playing video games, texting a friend, or aimlessly surfing the internet.
Nor is it only in the home that data collection and analysis algorithms will do their work. As we made clear on our recent episode on The Internet of Things, soon the entire built environment—including your school, your workplace, your car, the places where you shop, eat, and play—will amount to a series of platforms for collecting and analyzing reams and reams of data about us. In effect, we will be reduced to little more than collections of data points. And collections of data points—even living, breathing ones—are not the sorts of things that have reasonable expectations of privacy.
The erosion of privacy has been a long time coming and has many different sources. No doubt it has been helped along by both governments and corporations. Each has played a decisive role in the rise of the hyper-surveillance to which we are increasingly subject in many aspects of our lives. Governments have gone down this path out of a perhaps understandable but still excessive concern for security. Corporations have done so out of an obsession with their own bottom lines. What is perhaps harder to admit to ourselves is our own role in the gradual erosion of privacy. It’s truer than we care to admit, I think, that privacy has not been so much taken from us as surrendered by us. And it has been surrendered by us for what can seem a mere pittance in return. We give data away to the likes of Facebook and Google without a second thought and delude ourselves that the surrender of our data is a fair price for the pleasure of chatting with friends, playing inane games, and seeing an endless stream of cat, baby, and travel videos.
What we have failed to appreciate is the true nature of the bargain we have struck with various online platforms. Facebook is a case in point. Though people tend to think of themselves as “customers” of Facebook, we are not, strictly speaking, their customers at all. We are commodities, the products they sell, not just to commercial advertisers, but to political campaigns, psychological and medical researchers, and a host of others with a vested interest in probing, analyzing, and manipulating our behavior, for good or for ill, both online and off. It is such organizations, businesses, and campaigns that are the real customers of Facebook.
Now one source of our failure to fully appreciate the nature of the bargain we have struck is that there is little to no truth in advertising in this domain. Calling Facebook a “social media platform” is rather like calling a beef producer a “bovine brotherhood platform.” A more honest label would be something like “data harvesting platform.” The same is true of the many so-called smart devices we are inviting into our homes. It’s one thing to bring a “personal data assistant” into your supposed castle. It’s another thing entirely to bring a surreptitious data mining device into it. The former may feel empowering, while the latter threatens to undermine one’s privacy. Similarly, if Facebook were to greet you by saying something like “welcome to our data harvesting platform,” that wouldn’t sound so nice. Indeed, it sounds a little ominous. But it does have the benefit of greater honesty.
Psychologists, philosophers, marketers, and others know that labels matter a great deal. They help us frame issues and phenomena. And there is overwhelming empirical evidence that the human mind is highly sensitive to framing effects. Frame a scenario as one involving potential gains and we humans reason about it in one way. Frame the very same scenario as one that involves potential losses and we reason about it in an entirely different way. Framing Facebook, and other such platforms, as social media platforms invites us to notice all the cool and fun things that we can do on the platform. Framing them as surreptitious data harvesting platforms would invite us to focus less on what we do on the platform and more on what THEY do.
Because most people are mostly blind to the effects of framing on their own behavior, we are often easily manipulated by those who have mastered the art of effective framing. But that is precisely why truth in advertising is an important moral imperative. If companies and governments were required to name and describe things in ways that made their real function transparent, it would increase the autonomy and independence of the so-called “end-user.” That’s one reason I love the lovely little phrase “clickbait.” It perfectly encapsulates the real function of certain sorts of content and in some small measure helps to make the user more autonomous, more self-aware, and less subject to manipulation and capitalist demand creation.
Now Facebook, Google, Amazon, and all the corporations who are feeding off the rotting carcass of privacy are all about capitalism. And like good capitalists the world over, they are simply giving us what we want. And if we choose to give up our privacy in exchange for the pleasures of browsing and surfing and chatting, so be it, one might say. But this way of thinking is blind both to the true logic of capitalism—which is as much about demand creation as about the satisfaction of independently existing demands—and to the fact that, as I said above, we are not the customers so much as the product that these platforms sell.
Of course, if we are to willingly turn ourselves into products, on sale to the highest bidders for our data, there must be some draw, some lure, that will bring us onto the platforms. That means that they have to give us something that we want and value. But return to the bovine analogy we considered briefly above. The pastures of the meat maker are very attractive to cows and offer the cows precisely what cows want... lots and lots of yummy grass. The grass is the lure. But for all that, beef is still the product, not the customer of the meat maker.
And do not underestimate the role and importance of "demand creation" here. Cows have pretty limited and fixed desires. It doesn’t take a genius to figure out how to lure cows onto the pasture. But we humans are different and much more complicated. The clever capitalist technology entrepreneur doesn’t necessarily start out by asking “What does the product (i.e., the human) independently want?” Rather, she asks "What can I GET the human product to want?" and (especially) "What can I get them to want more and more of?" Now this approach also works for would-be customers and clients, as well as the human commodities whose information and data such companies are really after. But again, the ordinary Facebook user is a mere commodity to Facebook, a bundle of harvestable data and information, just as a cow in the pasture is a commodity to the meat maker. The goal of Facebook is to harvest as much as it can, in any way it can get away with, from the human commodities it lures onto its platform.
I haven’t said much about the government and its role in the erosion of privacy. I will say that, unlike many others, I am at least as worried about the effects of corporations as I am about the role of governments in eroding privacy. That’s because the one saving grace of the government and its intrusions is that, at least in theory, governments can be democratically regulated. Governments, at least truly democratic ones, are not entitled to treat citizens as mere commodities, on sale to the highest bidder. As such, they are at least in principle answerable to us and subject to democratic regulation by us.
Democratic regulation of corporations, especially multinational ones that span the globe in their reach, is much harder to pull off. Of course, I admit that the idea of democratic regulation of government is good in theory but hard to achieve in practice. This is especially true in America, which is far less of a democracy than we like to admit. But spelling out how America’s so-called democracy falls very far short, how these shortcomings contribute to the erosion of privacy, and what, if anything, we can do about it, short of forming a new American democratic republic, would open up an entirely different can of worms. So I’ll just leave it at that for now and save my fuller rant about the demise of democracy and privacy for another day.