Monday 7 February 2022

BOOK REPORT: THE AGE OF SURVEILLANCE CAPITALISM: THE FIGHT FOR A HUMAN FUTURE AT THE NEW FRONTIER OF POWER by SHOSHANA ZUBOFF. Part Two.




 

I WROTE AN EARLIER POST discussing Shoshana Zuboff’s book The Age of Surveillance Capitalism, and I’ll continue that discussion here, adding further points she makes about the changing nature of capitalism and how those changes might affect our lives and our societies.
ONE REVIEWER has compared The Age of Surveillance Capitalism to Rachel Carson’s Silent Spring: a book as seminal and ground-breaking in its examination of today’s high-tech industry as Carson’s 1962 exposé was of DDT and the chemical industry’s pollution of the environment. Sam Biddle of The Intercept news site says the book is:

“…also a masterwork of horror. It's hard to recall a book that left me as haunted as Zuboff's, with its descriptions of the gothic algorithmic daemons that follow us at nearly every instant of every hour of every day to suck us dry of metadata. Even those who've made an effort to track the technology that tracks us over the last decade or so will be chilled to their core by Zuboff, unable to look at their surroundings the same way.”

 

HIGH PRAISE INDEED, AND ALSO DISTURBING, and like Biddle I’m struck by Zuboff’s laser-like focus—not only on the extent of Big Tech’s reconnaissance of our daily lives, but also on the utter rapaciousness with which it goes about mining every last bit of data that it can. And, as Shoshana reminds us, Big Tech does so because it must. If companies like Google and Facebook stop gathering up-to-the-minute data on virtually everyone and everything, or even slow down, they will fall behind their competitors, and their predictive algorithms will become dated and less relevant to their customers, who demand fresh servings of information about their target audience (which, of course, is us).

 

BUT IT WASN’T LIKE THIS at the beginning, in the late 1990s and early 2000s, when the great tech companies were just stirring from their cradles. Information technology was a new thing, and organizing the ever-increasing flow of data and information was a constant challenge. Not only was data transmitted and received by users of the online technologies, for example in a search query, but extra data accumulated with the user’s action. Digitally embedded information about “the number and pattern of search terms, how a query is phrased, spelling, punctuation, dwell times, click patterns, and location” was automatically produced. Such packets of information were gathered, logged, and stored, but they were considered of little value, “operationally ignored” (67), essentially digital garbage. The data that was collected and collated was used to improve the products and services these new companies were bringing to the market.
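To make the idea concrete, here is a minimal, purely illustrative sketch (my own invention, not Google’s actual logging schema) of the kind of record Zuboff is describing: the query a user types, plus the “exhaust” that rides along with it. Every field except the query text is the sort of material that was once logged and then ignored.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class QueryLogRecord:
    """Hypothetical search-query log entry: the query plus its 'digital exhaust'."""
    query: str                                              # what the user actually typed
    timestamp: datetime                                      # when the query was made
    misspellings: list[str] = field(default_factory=list)   # raw typos before any correction
    dwell_time_secs: float = 0.0                             # time lingering on the results page
    clicked_results: list[str] = field(default_factory=list)  # which links were followed
    approx_location: str = ""                                # coarse location inferred from IP or device

# In the early days, a record like this was "data exhaust": stored, then operationally ignored.
record = QueryLogRecord(
    query="neolithic cave dwellers gibraltar",
    timestamp=datetime.now(timezone.utc),
    misspellings=["neolithc"],
    dwell_time_secs=42.5,
    clicked_results=["example.org/gorham-cave"],
    approx_location="Vancouver, CA",
)
```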

 

THAT ALL CHANGED AROUND 2000 when Google “data-mining” scientist Amit Patel speculated that the waste material or “digital exhaust” accumulating in company servers could be put to use: “detailed stories about each user—thoughts, feelings, interests—could be constructed from the wake of unstructured signals that trailed every online action.” (68) These “digital breadcrumbs”, he speculated, could be used to predict user preferences and behaviours, allowing Google, the earliest of the tech companies to realize the importance of such data, to sell this information to advertisers, who would then target individuals, and groups with similar demographics, with specifically crafted advertisements. That’s how I came to receive an advertisement from Amazon only hours (probably sooner; I was napping) after I had been editing my poetry blog online. I’d added several “tags” to one post—they’re like digital labels readers can click on to get lists of similarly themed posts. One post I had conveniently labelled “Poetry”, which is the breadcrumb I assume Google’s algorithms harvested, then sold to Amazon, which, in turn, emailed me an advertisement for a poetry “how-to” book. (Or, maybe, Google’s AI machinery read my poetry and decided that I needed immediate help with my writing!) 1


BUT WHAT SHOULD GIVE READERS PAUSE is the disturbing synergy that exists between the two tech giants: data hoovered up at one end by Google comes out the other end at Amazon. Nearly seamless, and no doubt, if I purchased that poetry-helper book (not that I would; I have my pride!), Google would hoover up the breadcrumbs from that transaction and add them to the data log it keeps on me at one of its server farms.

WHERE ZUBOFF’S ANALYSIS IS MOST HELPFUL is in showing how twenty-first-century surveillance capitalism differs from nineteenth- and twentieth-century industrial capitalism. The working model of industrial capitalism is as follows: the capitalist gathers raw materials to make widgets, or whatever, at their factory. They pay workers to manufacture them and then sell the widgets to customers, and—importantly—they receive feedback about their product from things like sales numbers, competitor comparisons, customer surveys, consumer reports, investor confidence, repeat buying patterns, and so on. This data is then used by the industrial capitalist to improve their widget, boosting sales and out-competing their rivals. And so the wheel spins. Any information that doesn’t directly help improve the product or increase manufacturing efficiency is discarded or ignored. This is traditional capitalism, industrial capitalism, the kind we have known until very recently, as shown in Figure 1, above.


TODAY’S SURVEILLANCE CAPITALIST, on the other hand, leaves that model behind, and in order to produce their product, they use raw material of a different kind—us. For we are not Google’s customers when we use one of their products like Search, says Zuboff; we are the grist for their mill. More specifically, our activities and our addresses, our amenities and adequacies; our adversities, ancestries, academies, our apologies and absurdities, our apostrophes even, perhaps, our adulteries. And that’s only the A’s! Think of all the information we routinely input when we log in, or solve a word-recognition puzzle to prove we’re not a “robot”, or complete a survey, update personal information, send a rain check, check our email, use spell-check, conduct business, pay bills, do a search, research, buy merch, etc., etc. Anything you do online (and increasingly offline) leaves a trail of data crumbs behind. Such data was originally labelled “data exhaust” and ignored. Today, these digital bits of ourselves are the raw material of surveillance capitalism, the gold standard of resource material, the mother lode. The more data Big Tech can collect about us, from our work, community, and home lives, the better its “machine analytics”, those sophisticated artificial intelligence technologies designed to learn about the world and about us, will be able to predict our behaviour. And, as a consequence, the better they will be able to shape it, as well.


Big Tech’s profits are not derived from the devices and services (free and otherwise) they offer—rather, it is we—our lives and our data—who are the base metal which the algorithmic ‘philosopher’s stone’ of Big Tech (made from its giant databases, massive computing power, and sophisticated monitoring systems) turns into the rich trove of information it can sell and use to its commercial advantage. Figure 2, above, shows the changes to industrial capitalism’s “Reinvestment Cycle” that came with Google and other technology companies’ discovery of “Behavioural Surpluses” and the riches they contain. One man’s “data exhaust” is another man’s “prediction products”.

 

AN EXAMPLE OF THE INCREASING SOPHISTICATION of Big Tech’s data machinery is the near-ubiquitous Search engine provided by Google. Each query I make, from spelling and word usage to Gibraltar’s neolithic cave-dwellers and their stone-tool traditions, is collated and recorded by Google’s algorithms. Various “tells” or traits I may show in querying, including spelling, word choice, results clicked on, the time of day I’m actively searching, and so on, are recorded and analyzed. WHY? So that Google Search gets to know “me” better (or the “me” that is represented by the data clues I leave behind). For example, as part of its analytic machinery, it lists search “suggestions” below the query I type in, suggestions based on my prior searches, some made days or even weeks earlier. If I click on one of Google’s suggestions, that, too, helps Google Search’s AI to learn more about me. AGAIN, WHY? Is it because I’m such an interesting fellow? Hardly. To know me is to eventually control me.
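As a rough illustration only (my own toy example, not Google’s actual ranking, which is vastly more elaborate and proprietary), personalized suggestions can be sketched as a simple prefix match over a user’s own query history, ordered by how often each past query was made. Even this crude version shows why clicking a suggestion teaches the system something: it confirms which guess about “me” was right.

```python
from collections import Counter

def suggest(prefix, past_queries, limit=5):
    """Toy personalized autocomplete: rank the user's own past queries that
    start with the typed prefix by how often they were issued.
    (Illustrative only; real suggestion systems blend global popularity,
    freshness, location, and far more besides.)"""
    counts = Counter(q.lower() for q in past_queries)
    matches = [q for q in counts if q.startswith(prefix.lower())]
    return sorted(matches, key=lambda q: counts[q], reverse=True)[:limit]

# A few prior searches "remembered" about one user...
history = [
    "gibraltar neanderthal cave",
    "gibraltar neanderthal cave",
    "neolithic stone tools",
    "neolithic stone tools britain",
    "gibraltar ferry times",
]
print(suggest("neo", history))  # ['neolithic stone tools', 'neolithic stone tools britain']
print(suggest("gib", history))  # ['gibraltar neanderthal cave', 'gibraltar ferry times']
```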

Google Search, Facebook’s “likes”, our daily news feeds and Twitter feeds, those digital health and fitness monitors we use, our appointment calendars, and the growing array of sensors, cameras, and digital assistants like “Alexa” or “Siri”: all of these gather raw data about us as we negotiate our “online” and real-world lives—with the ultimate goal of collating, analyzing, and finally predicting our behaviour, and then “tuning”, “herding”, and “conditioning” us to act in ways that are beneficial for the bottom lines of Big Tech’s real customers: not us, of course, but rather the corporations, governments, and elites who desire up-to-the-minute predictive analysis about people and populations.

THOSE LAST THREE TERMS (tuning, herding, conditioning)2 are straight from old-school behavioural psychology, and in particular, the work of B.F. Skinner, and they involve increasing levels of behaviour modification, from the seemingly innocuous to the downright coercive.3 Zuboff notes that Skinner, in later life, lamented that his theories for ordering human behaviour lacked the necessary mechanisms to turn them into realities. By the year 2000, the observational, analytical, computational, and operational instruments of Big Tech were making Skinner’s vision of social harmony through behavioural modification a realizable goal.


ZUBOFF DISCUSSES SKINNER’S THEORIES IN SOME DETAIL, noting with concern how many of his ideas for modifying human behaviour through operant conditioning were being adopted by this century’s technology companies. (And, it should be said, by governments as well—vaccine “passports”, with their comprehensive, and increasingly coercive, systems of monitoring and control, are a recent and disturbing turn of affairs in our body politic.) At the core of Skinner’s behavioural system is his conception of the self, the inner personality of individuals. He uses the term “other-one” to describe how he sees human beings: we are all “others” to each other. What we conceive of as our unique identities, our private, sacred selves, are the results of myriad interactions with our environment and, of course, with other people. We are the sum total, Skinner says, of everything and everyone else acting upon our individual brains and nervous systems. Of the individual, he seems to say: there is no there there.


ZUBOFF COULDN’T AGREE LESS with such a worldview. And she argues, rigorously and passionately, against what she sees as nothing less than the operationalizing, by Big Tech, of the old behaviourist’s theories. And, because of the commercial imperatives of technology companies like Google and Microsoft, and increasingly because of governments and their monitoring and manipulating of individuals and groups, we are becoming acculturated—“tuned”, “herded” and “conditioned”—into accepting less and less autonomy and agency in our personal, home, work, and community lives. Zuboff’s book is a gauntlet thrown down to challenge those who would usher in such a world.

Well, that is enough for now. If you press the "LIKE" button below, you will be rewarded with a pellet of tasty feed!

👍

Enjoy! Jake

_______________________________________

 

1. BLOGGER (where I currently blog from) is a “free” blogging site owned by Google. Of course, it is also a trove of valuable data points for the company, which is why it’s free. The “breadcrumbs” Google sweeps up and sells to advertisers about its BLOGGER users waaay more than offset the cost of running the digital service. If only all my words were as precious to Google!

  

2. Tuning—as a behaviour modification tool, involves “subliminal cues designed to subtly shape the flow of behaviour at the precise time and place for maximally efficient influence.” (294) For example, in a classroom, desks face the front, staging the room’s setting to direct the students’ attention towards the teacher. In terms of modifying behaviour, every little bit of “tuning” helps.

Herding—“relies on controlling key elements in a person’s immediate context.” (295) Zuboff uses the example of the “uncontract” which is the opposite of a “contract”. A contract is a social arrangement, a promise…to do something or to refrain from doing something. The making of a contract requires the mutual assent [italics mine] of two or more persons, one of them ordinarily making an offer and another accepting. (Britannica) An “uncontract” is different because it is an agreement whereby one party has a hidden advantage over the other. For example, in the (not so distant) future, an insurance company may have buried a fine-print clause in your policy allowing them to remotely deactivate your car and send out the repo-men—done automatically via machine intelligence—if you’re late paying a premium. Oopsie! Too Bad. In a real sense, you have a policy with an algorithm, and woe betide anyone who transgresses against the machine!
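To see why the uncontract feels so one-sided, here is a purely hypothetical sketch (my own invention, not drawn from any real insurer’s system or from the book’s text) of what a machine-enforced fine-print clause might look like: no negotiation, no appeal, just a rule that fires on its own.

```python
from datetime import date, timedelta

GRACE_PERIOD = timedelta(days=3)  # a hypothetical grace period buried in the fine print

def enforce_uncontract(premium_due: date, premium_paid: bool, today: date) -> str:
    """Toy 'uncontract' enforcement: if the premium is overdue past the grace
    period, the policy's remote-disable clause triggers automatically.
    No human reviews the decision, which is precisely the point."""
    if premium_paid:
        return "policy in good standing"
    if today > premium_due + GRACE_PERIOD:
        return "IMMOBILIZE VEHICLE: dispatch recovery agent"  # the machine decides
    return "send payment reminder"

print(enforce_uncontract(date(2022, 2, 1), premium_paid=False, today=date(2022, 2, 7)))
# -> IMMOBILIZE VEHICLE: dispatch recovery agent
```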

IT'S CLEAR THAT the “uncontract” is the true agreement between Google and the users of its services. Users are unaware of the extractive processes that surveillance capitalism relies on, of the kinds of intrusion the company makes into private life, and of how deep those intrusions extend. (And that goes for the other Big Tech companies, and increasingly for old-school industrial firms and governments.) Thus, by hiding and misrepresenting its business model, Google “herds” users in the direction the company wishes to go.

INTERESTINGLY, the need for secrecy in its operations was noted very early in Google’s company history. Shoshana discusses a 2001 meeting where top executives discussed Google’s business model. They asked: what type of company had they created? What was Google? In a recent interview, she describes how the idea for “surveillance capitalism” was first conceived:

 

“[Company co-founder] Larry Page mused that if it [Google] did have a category, a business category, it would be personal information. Because everything that you do, everything that you say, everywhere that you go, every place that you’ve been, every aspect of your experience is going to be searchable. Indexable. [They] will be able to know it all, and at that moment it became clear that, while Google was marketing itself as a search engine, Russell, and we thought that we searched Google, even as early as 2001, those young men understood that they had to reverse the entire process, that it was not users searching Google, it was Google searching users. The search engine worked in reverse. And that was the source of [the] breakthrough insight that founded surveillance capitalism. Because they understood, right from the start, that they could not do this in a way that people were aware of, that it had to be hidden, [italic mine] because users would not allow themselves to be searched and lawmakers would be triggered into passing privacy laws. And so, right from the start, they had something they called the ‘hiding strategy’.” Zuboff interview. Russell Brand podcast. 1 Feb 2022. 

 

Conditioning—is the third tool in the behaviour modification toolkit, and it is “a well-known approach to inducing behaviour change…” (296) The behaviourist school of psychology had been around since the beginning of the twentieth century, but it wasn’t until B.F. Skinner arrived on the scene in the post-war years that the study of human behaviour—how we act, why we act the way we do and, crucially, whether our behaviours could be deliberately modified—came fully into its own.

TO THE TRADITIONAL VARIABLES of “stimulus/response” that occupied early behaviourists, Skinner brought his revolutionary regime of “operant conditioning” or “reinforcement”, his famous use of negative and positive reinforcements, to change a subject’s behaviours. This comes into play with Big Tech and their development of apps and devices (Fitbit, Apple Watch, etc.), along with their massive computing power, public and private CCTV cameras, and facial and voice recognition software. The ability to gather, collate, and then operationalize modifications to users’ behaviour is the ‘ELDORADO’ of surveillance capitalism: the manipulation of individuals and groups in real time, all for commercial gain.

RESEARCHERS FROM the University of Texas and the University of Central Florida, in a study of thirteen such applications, concluded that the monitoring devices “contain a wide range of behaviour change techniques typically used in clinical behaviour interventions.” (297) We are fast becoming a nation of mice trapped in a maze, manipulated into searching for that tasty bit of cheese.
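To illustrate the flavour of reinforcement Skinner found most potent, here is a toy sketch (my own, not taken from the book or from any particular app) of a variable-ratio reward schedule, the pattern critics say lurks behind streaks, badges, and unpredictable notifications: on average one action in several is rewarded, but the user can never tell which.

```python
import random

def variable_ratio_reward(mean_ratio=4):
    """Toy variable-ratio reinforcement: roughly one action in every
    `mean_ratio` earns a reward, but unpredictably. Skinner found this
    schedule produces the most persistent behaviour in his subjects."""
    return random.random() < 1.0 / mean_ratio

# Simulate a user tapping "refresh" twenty times; rewards arrive unpredictably,
# which is exactly what keeps the tapping going.
for tap in range(1, 21):
    if variable_ratio_reward():
        print(f"tap {tap}: reward! (new likes, a badge, a notification)")
    else:
        print(f"tap {tap}: nothing this time... tap again?")
```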

 

3. INTERESTING SIDE NOTE: In the early 1970s, Zuboff, as a graduate student at Harvard University, took classes with Skinner. Then, as now, she stridently opposed his theories on human psychology and his views on organized society.

 

4. "WHERE'S THE GOAT?! Think about the level of complacency this little critter was conditioned to feel. Eating that tasty grass and those green plants is all well and good, but it was  probably a good time for a HEADS UP! moment.

 

 

--------Zuboff provides her definition of “Surveillance Capitalism”, from the preface to her book:

“1. A new economic order that claims human experience as free raw material for hidden commercial practices of extraction, prediction, and sales; 2. A parasitic economic logic in which the production of goods and services is subordinated to a new global architecture of behavioural modification; 3. A rogue mutation of capitalism marked by concentrations of wealth, knowledge, and power unprecedented in human history; 4. The foundational framework of a surveillance economy; 5. As significant a threat to human nature in the twenty-first century as industrial capitalism was to the natural world in the nineteenth and twentieth; 6. The origin of a new instrumentarian power that asserts dominance over society and presents startling challenges to market democracy; 7. A movement that aims to impose a new collective order based on total certainty; 8. An expropriation of critical human rights that is best understood as a coup from above: an overthrow of the people’s sovereignty.”

 

 

Zuboff, Shoshana. The Age of Surveillance Capitalism: The Fight For A Human Future At the New Frontier of Power. New York: Hachette Book Group, 2019. Print.

 

Zuboff, Shoshana. “Big Other: Surveillance Capitalism and the Prospects of an Information Civilization.” Journal of Information Technology 30 (2015): 75-89. Web. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2594754

 

 

FREE JULIAN ASSANGE and STEVEN DONZIGER

 

 

 

 

