The Week in Tech: How Google and Facebook Spawned Surveillance Capitalism

By Natasha Singer

Jan. 18, 2019

Greetings, I’m Natasha Singer, your resident privacy reporter. And I’m writing to you from wintry New York City as the government shutdown increases financial pressure on federal workers and the tech elites jet off to Davos, Switzerland, to hobnob at the World Economic Forum.

For the last few years, the forum has been heralding the “Fourth Industrial Revolution.” That’s the idea that today’s digital innovations are generating entire new industries — in much the way electricity enabled the mass production of the Model T Ford in the early 20th century.

But a provocative new book, “The Age of Surveillance Capitalism,” by Shoshana Zuboff, a professor emerita at the Harvard Business School, offers a more sobering counternarrative.

Published on Tuesday, the book argues that digital services developed by the likes of Google and Facebook should not be viewed as the latest iteration of industrialization. Instead, Dr. Zuboff writes, they represent a new and problematic market form that trades in predicting and influencing human behavior.

“Surveillance capitalism has taken human experience, specifically private human experience, and unilaterally claimed it as something to be bought and sold in the marketplace,” Dr. Zuboff told me during a visit to The Times’s office. “This new kind of marketplace trades in behavioral futures. It’s like a form of derivative. But it’s about us.”

Yet most of us are not aware that platforms like Google and Facebook may track and analyze our every search, location, like, video, photo, post and punctuation mark, the better to try to sway us, she said.

In fact, a new study on Facebook from the Pew Research Center illustrates how opaque this behavior marketplace can be to consumers.

The study, my colleague Sapna Maheshwari writes, reported that about three-fourths of Facebook users were unaware that the social network maintained lists of their personal interests, such as their political leanings, for advertisers. And about half of users who looked at their “ad preferences” — the Facebook pages displaying these details — said they were uncomfortable with the company’s creating lists of categories about them.

The technologies that power the behavior speculation market, of course, have spread far beyond online ads.

They enable auto insurers to surveil drivers and offer discounts based on their driving performance. They allow workplace wellness programs to charge higher health insurance premiums to employees who decline to wear fitness trackers. They helped Kremlin-linked groups mount political influence campaigns on Facebook (although, as my colleague John Herrman pointed out this past week, we have yet to learn how effective those campaigns were).

The flash-trading in human behavioral data was not inevitable.

In her book, Dr. Zuboff describes how Google, in its early days, used the keywords that people typed in to improve its search engine even as it paid scant attention to the collateral data — like users’ keyword phrasing, click patterns and spellings — that came with it. Pretty soon, however, Google began harvesting this surplus information, along with other details like users’ web-browsing activities, to infer their interests and target them with ads.

The model was later adopted by Facebook.

The companies’ pivot — from serving to surveilling their users — pushed Google and Facebook to harvest more and more data, Dr. Zuboff writes. In doing so, the companies sometimes bypassed privacy settings or made it difficult for users to opt out of data-sharing.

“We saw these digital services were free, and we thought, you know, ‘We’re making a reasonable trade-off with giving them valuable data,’” Dr. Zuboff told me. “But now that’s reversed. They’ve decided that we’re free, that they can take our experience for free and translate it into behavioral data. And so we are just the source of raw material.”

Of course, tech companies tend to bristle at the word “surveillance.” They associate it with government spying on individuals — not with their own snooping on users and trying to sway them at scale.

“When organizations do surveillance, people don’t have control over that,” Mark Zuckerberg, Facebook’s chief, said in April during a Senate hearing on Cambridge Analytica, the voter-profiling company that improperly harvested the data of millions of Facebook users. “But on Facebook, everything that you share, you have control over.”

Surveillance, however, simply means observation or supervision, often with the intent of channeling the surveilled in a particular direction. As Dr. Zuboff’s book points out, that is at the core of Facebook’s panopticon of a business model.

Natasha Singer covers data privacy and tech accountability for The New York Times. She also teaches a tech ethics course at the School of The New York Times, The Times’s pre-college program. Follow her on Twitter: @natashanyt.
