Zucked — Roger McNamee

Amazon link

This book is the story of why I became convinced, in spite of myself, that even though Facebook provided a compelling experience for most of its users, it was terrible for America and needed to change or be changed, and what I have tried to do about it.

Zucked is an interesting perspective and narrative on Facebook (especially recent events, lol #privacy) from an ex-Facebook mentor, investor, user, and fan with serious Silicon Valley chops and good writing skills. The middle portion of the book, though long-winded, gives a blow-by-blow account of the author’s attempt (along with others, including Tristan Harris, of Center for Humane Technology fame) to raise these issues with Facebook itself (which failed), as well as with various stakeholders, particularly the government. The results of this work have yet to be seen, but Sen. Warner’s proposal for tech regulations may be the start of a movement in — what McNamee convinces us is — the right direction.

The internet platforms have harvested [Attention Merchants?] fifty years of trust and goodwill built up by their predecessors. They have taken advantage of that trust to surveil our every action online, to monetize personal data. In the process they have fostered hate speech, conspiracy theories, and disinformation, and enabled interference in elections. They have artificially inflated their profits by shirking civic responsibility. The platforms have damaged public health, undermined democracy, violated user privacy, and, in the case of Facebook and Google, gained monopoly power, all in the name of profits.

McNamee’s account of his first meeting with Zuck is fascinating (if true), and is included in the highlights below (Location 231).

How we got here

In addition to harvesting said goodwill, Facebook (and other startups today) has benefited from an increasingly open source stack and the proliferation of the cloud, which have reduced the cost of developing world-class products. Meanwhile, lean/agile philosophies have reduced time to market. And a move towards laissez-faire economics and libertarianism has helped too.

McNamee also devotes a whole chapter to Stanford’s B.J. Fogg and his work on persuasive technology:

Fogg’s textbook lays out a formula for persuasion that clever programmers can exploit more effectively on each new generation of technology to hijack users’ minds. Prior to smartphones like the iPhone and Android, the danger was limited. After the transition to smartphones, users did not stand a chance. Fogg did not help. As described in his textbook, Fogg taught ethics by having students “work in small teams to develop a conceptual design for an ethically questionable persuasive technology—the more unethical the better.” He thought this was the best way to get students to think about the consequences of their work.

Tristan makes the case that platforms compete in a race to the bottom of the brain stem—where the AIs present content that appeals to the low-level emotions of the lizard brain, things like immediate rewards, outrage, and fear.

which pairs well with this quote on A/B testing to death:

The company [Facebook] tests every pixel to ensure it produces the desired response. Which shade of red best leads people to check their notifications? For how many milliseconds should notifications bubbles appear in the bottom left before fading away, to most effectively keep users on site? Based on what measures of closeness should we recommend new friends for you to “add”? When you have more than two billion users, you can test every possible configuration.

On antitrust

Google, Facebook, and others also broke the mold by adopting advertising business models, which meant their products were free to use, eliminating another form of friction and protecting them from antitrust regulation.

The US economy has historically depended on startups far more than other economies, especially in technology. If my hypothesis is correct, the country has begun an experiment in depending on monopolists for innovation, economic growth, and job creation. If I consider Google, Amazon, and Facebook purely in investment terms, I cannot help but be impressed by the brilliant way they have executed their business plans. The problem is unintended consequences, which are more numerous and severe than I imagined. These companies do not need to choke off startup activities to be successful, but they cannot help themselves. That is what monopolists do. Does the country want to take that risk?

On culture and the Zuck-Sandberg cult

“[Zuck] had an advantage not available to earlier generations of entrepreneurs: he could build a team of people his age—many of whom had never before had a full-time job—and mold them. This allowed Facebook to accomplish things that had never been done before.”

The Verge published a story by Casey Newton about Tavis McGinn, who had recently left Facebook after a six-month stint as the personal pollster for Zuck and Sheryl. The story shocked us. Why would Facebook—which employs a small army to survey users on every issue imaginable—need to hire a new person for the sole purpose of polling the popularity of its two top executives? More remarkable was the timing: Tavis had been at Facebook from April through September 2017. They had hired the pollster while they were still denying any involvement in the Russian interference.

On December 11, 2017, The Verge reported that Chamath Palihapitiya, Facebook’s former vice president of growth, had given a speech at Stanford the month before in which he had expressed regrets about the negative consequences of Facebook’s success. “I think we have created tools that are ripping apart the social fabric of how society works…” [his] remarks echoed those of Sean Parker, the first president of Facebook, who in November had expressed regret about the “social-validation feedback loop” inside the social network… Facebook had ignored Parker, but apparently they jumped on Palihapitiya. Within seventy-two hours of The Verge’s initial report, Palihapitiya publicly reversed course… He subsequently appeared on Christiane Amanpour’s show on CNN International and made it clear he thought Mark Zuckerberg was the smartest person he had ever met and suggested that Zuck was uniquely qualified to figure it out and save us all… Why was [his] criticism more problematic for Facebook than that of Sean Parker or any of the earlier critics? There was one obvious difference. Before [he] left Facebook in 2011, he had recruited many of the leaders of the Growth team… If Chamath had continued to question Facebook’s mission, it is quite possible that the people he hired at the company, and those who knew him, might begin to question their leaders’ and company’s choices. The result might be a Susan Fowler Moment, named for the Uber engineer whose blog post about that company’s toxic culture led to an employee revolt and, ultimately, the departure of the executive team.

Some of his comments on Google are particularly insightful:

Google made a list of the most important things people did on the web, including searches, browsing, and email. In those days, most users were forced to employ a mix of open source and proprietary tools from a range of vendors. Most of the products did not work together particularly well, creating a friction Google could exploit. Beginning with Gmail in 2004, Google created or acquired compelling products in maps, photos, videos, and productivity applications. Everything was free, so there were no barriers to customer adoption. Everything worked together. Every app gathered data that Google could exploit. Customers loved the Google apps. Collectively, the Google family of apps replaced a huge portion of the open World Wide Web. It was as though Google had unilaterally put a fence around half of a public park and then started commercializing it.

Thanks to its search engine, cloud services, and venture capital operation, Google has an exceptionally good view of emerging products.

Google realized that its data set of purchase intent would have greater value if it could be tied to customer identity. I call this McNamee’s 7th Law: data sets become geometrically more valuable when you combine them. That is where Gmail changed the game. Users got value in the form of a good email system, but Google received something far more valuable. By tying purchase intent to identity, Google laid the foundation for new business opportunities. It then created Google Maps, enabling it to tie location to purchase intent and identity. The integrated data set rivaled Amazon’s, but without warehouses and inventory it generated much greater profits for Google. Best of all, combined data sets often reveal insights and business opportunities that could not have been imagined previously. The new products were free to use, but each one contributed data that transformed the value of Google’s advertising products.

And his perspective on, and quantification of, the election interference story:

I concluded with an observation: the Russians might have used Facebook and other internet platforms to undermine democracy and influence a presidential election for roughly one hundred million dollars, or less than the price of a single F-35 fighter.

The night before the first hearing, Facebook disclosed that 126 million users had been exposed to Russian interference, as well as 20 million users on Instagram.

The fact that four million people who voted for Obama in 2012 did not vote for Clinton in 2016 may reflect to some degree the effectiveness of the Russian interference. How many of those stayed away because of Russian disinformation about Clinton’s email server, the Clinton Foundation, Pizzagate, and other issues?

In an election where only 137 million people voted, a campaign that targeted 126 million eligible voters almost certainly had an impact. 

And Facebook as a scary force:

If you are a fan of democracy, as I am, this should scare you. Facebook has become a powerful source of news in most democratic countries. To a remarkable degree it has made itself the public square in which countries share ideas, form opinions, and debate issues outside the voting booth. But Facebook is more than just a forum. It is a profit-maximizing business controlled by one person. It is a massive artificial intelligence that influences every aspect of user activity, whether political or otherwise. Even the smallest decisions at Facebook reverberate through the public square the company has created with implications for every person it touches. The fact that users are not conscious of Facebook’s influence magnifies the effect. If Facebook favors inflammatory campaigns, democracy suffers.

In Myanmar, Free Basics transformed internet access by making it available to the masses. That is also the case in most of the other countries that have adopted the service. The citizens of these countries are not used to getting information from media. Prior to Free Basics, they had little, if any, exposure to journalism and no preparation for social media. Their citizens did not have filters for the kind of disinformation shared on internet platforms. An idea that sounded worthy to people in the US, Free Basics has been more dangerous than I suspect its creators would have imagined. In Myanmar, a change in government policy caused an explosion in wireless usage, making Facebook the most important communications platform in the country. When allies of the ruling party used Facebook to promote violence against the Rohingya minority, the company fell back on its usual strategy of an apology and a promise to do better.

The analyst in me could not help but notice that the story of Facebook’s role in the 2016 election had unfolded with one consistent pattern: Facebook would first deny, then delay, then deflect, then dissemble. Only when the truth was unavoidable did Facebook admit to its role and apologize.

Zeynep Tufekci, a brilliant scholar from the University of North Carolina, framed Facebook’s history as a “fourteen-year apology tour.” I reflected that it might be time to tweak Facebook’s corporate motto: Move fast, break things, apologize, repeat.

Public pressure produced more concessions from Facebook, which announced additional policy and product changes in an attempt to appear cooperative and preempt regulatory action. As usual, the announcements featured sleight of hand. First, Facebook banned data brokers. While this sounded like a move that might prevent future Cambridge Analyticas, what it actually did was move Facebook closer to a data monopoly on its platform. Advertisers acquire data from brokers in order to improve ad targeting. By banning data brokers, Facebook forced advertisers to depend entirely on Facebook’s own data.

When the AI behavioral-prediction engines of Facebook and Google reach maturity, they may be able to abandon the endless accumulation of content data and some forms of metadata—addressing a meaningful subset of the privacy concerns that have been raised in the press and in congressional hearings—without actually improving users’ privacy.

Notebook Export
Zucked
Roger McNamee

Prologue
Highlight(pink) - Location 74
This book is the story of why I became convinced, in spite of myself, that even though Facebook provided a compelling experience for most of its users, it was terrible for America and needed to change or be changed, and what I have tried to do about it. My hope is that the narrative of my own conversion experience will help others understand the threat. Along the way, I will share what I know about the technology that enables internet platforms like Facebook to manipulate attention. I will explain how bad actors exploit the design of Facebook and other platforms to harm and even kill innocent people. How democracy has been undermined because of design choices and business decisions by internet platforms that deny responsibility for the consequences of their actions. How the culture of these companies causes employees to be indifferent to the negative side effects of their success. At this writing, there is nothing to prevent more of the same.
Highlight(pink) - Location 111
Zuck created Facebook to bring the world together. What I did not know when I met him but would eventually discover was that his idealism was unbuffered by realism or empathy. He seems to have assumed that everyone would view and use Facebook the way he did, not imagining how easily the platform could be exploited to cause harm. He did not believe in data privacy and did everything he could to maximize disclosure and sharing. He operated the company as if every problem could be solved with more or better code. He embraced invasive surveillance, careless sharing of private data, and behavior modification in pursuit of unprecedented scale and influence. Surveillance, the sharing of user data, and behavioral modification are the foundation of Facebook’s success. Users are fuel for Facebook’s growth and, in some cases, the victims of it.
Highlight(yellow) - Location 175
The stunning outcome of Brexit triggered a hypothesis: in an election context, Facebook may confer advantages to campaign messages based on fear or anger over those based on neutral or positive emotions. It does this because Facebook’s advertising business model depends on engagement, which can best be triggered through appeals to our most basic emotions. What I did not know at the time is that while joy also works, which is why puppy and cat videos and photos of babies are so popular, not everyone reacts the same way to happy content. Some people get jealous, for example. “Lizard brain” emotions such as fear and anger produce a more uniform reaction and are more viral in a mass audience. When users are riled up, they consume and share more content. Dispassionate users have relatively little value to Facebook, which does everything in its power to activate the lizard brain. Facebook has used surveillance to build giant profiles on every user and provides each user with a customized Truman Show, similar to the Jim Carrey film about a person who lives his entire life as the star of his own television show. It starts out giving users “what they want,” but the algorithms are trained to nudge user attention in directions that Facebook wants. The algorithms choose posts calculated to press emotional buttons because scaring users or pissing them off increases time on site. When users pay attention, Facebook calls it engagement, but the goal is behavior modification that makes advertising more valuable. I wish I had understood this in 2016. At this writing, Facebook is the fourth most valuable company in America, despite being only fifteen years old, and its value stems from its mastery of surveillance and behavioral modification.
Highlight(pink) - Location 198
If you are a fan of democracy, as I am, this should scare you. Facebook has become a powerful source of news in most democratic countries. To a remarkable degree it has made itself the public square in which countries share ideas, form opinions, and debate issues outside the voting booth. But Facebook is more than just a forum. It is a profit-maximizing business controlled by one person. It is a massive artificial intelligence that influences every aspect of user activity, whether political or otherwise. Even the smallest decisions at Facebook reverberate through the public square the company has created with implications for every person it touches. The fact that users are not conscious of Facebook’s influence magnifies the effect. If Facebook favors inflammatory campaigns, democracy suffers.
1. The Strangest Meeting Ever
Highlight(orange) - Location 231
I should probably tell the story of how I intersected with Facebook in the first place. In the middle of 2006, Facebook’s chief privacy officer, Chris Kelly, sent me an email stating that his boss was facing an existential crisis and required advice from an unbiased person. Would I be willing to meet with Mark Zuckerberg? Facebook was two years old, Zuck was twenty-two, and I was fifty. The platform was limited to college students, graduates with an alumni email address, and high school students. News Feed, the heart of Facebook’s user experience, was not yet available. The company had only nine million dollars in revenue in the prior year. But Facebook had huge potential—that was already obvious—and I leapt at the opportunity to meet its founder. Zuck showed up at my Elevation Partners office on Sand Hill Road in Menlo Park, California, dressed casually, with a messenger bag over his shoulder. U2 singer Bono and I had formed Elevation in 2004, along with former Apple CFO Fred Anderson, former Electronic Arts president John Riccitiello, and two career investors, Bret Pearlman and Marc Bodnick. We had configured one of our conference rooms as a living room, complete with a large arcade video game system, and that is where Zuck and I met. We closed the door and sat down on comfy chairs about three feet apart. No one else was in the room. Since this was our first meeting, I wanted to say something before Zuck told me about the existential crisis. “If it has not already happened, Mark, either Microsoft or Yahoo is going to offer one billion dollars for Facebook. Your parents, your board of directors, your management team, and your employees are going to tell you to take the offer. They will tell you that with your share of the proceeds—six hundred and fifty million dollars—you will be able to change the world. Your lead venture investor will promise to back your next company so that you can do it again. “It’s your company, but I don’t think you should sell. 
A big company will screw up Facebook. I believe you are building the most important company since Google and that before long you will be bigger than Google is today. You have two huge advantages over previous social media platforms: you insist on real identity and give consumers control over their privacy settings. “In the long run, I believe Facebook will be far more valuable to parents and grandparents than to college students and recent grads. People who don’t have much time will love Facebook, especially when families have the opportunity to share photos of kids and grandkids. “Your board of directors, management team, and employees signed up for your vision. If you still believe in your vision, you need to keep Facebook independent. Everyone will eventually be glad you did.” This little speech took about two minutes to deliver. What followed was the longest silence I have ever endured in a one-on-one meeting. It probably lasted four or five minutes, but it seemed like forever. Zuck was lost in thought, pantomiming a range of Thinker poses. I have never seen anything like it before or since. It was painful. I felt my fingers involuntarily digging into the upholstered arms of my chair, knuckles white, tension rising to a boiling point. At the three-minute mark, I was ready to scream. Zuck paid me no mind. I imagined thought bubbles over his head, with reams of text rolling past. How long would he go on like this? He was obviously trying to decide if he could trust me. How long would it take? How long could I sit there? Eventually, Zuck relaxed and looked at me. He said, “You won’t believe this.” I replied, “Try me.” “One of the two companies you mentioned wants to buy Facebook for one billion dollars. Pretty much everyone has reacted the way you predicted. They think I should take the deal. How did you know?” “I didn’t know. But after twenty-four years, I know how Silicon Valley works. I know your lead venture investor. I know Yahoo and Microsoft. 
This is how things go around here.” I continued, “Do you want to sell the company?” He replied, “I don’t want to disappoint everyone.” “I understand, but that is not the issue. Everyone signed up to follow your vision for Facebook. If you believe in your vision, you need to keep Facebook independent. Yahoo and Microsoft will wreck it. They won’t mean to, but that is what will happen. What do you want to do?” “I want to stay independent.” I asked Zuck to explain Facebook’s shareholder voting rules. It turned out he had a “golden vote,” which meant that the company would always do whatever he decided. It took only a couple of minutes to figure that out. The entire meeting took no more than half an hour. Zuck left my office and soon thereafter told Yahoo that Facebook was not for sale. There would be other offers for Facebook, including a second offer from Yahoo, and he would turn them down, too.
Highlight(orange) - Location 388
My first job after business school was at T. Rowe Price, in Baltimore, Maryland. It was a lot closer to Philadelphia than Hanover, but still too far to commute every day. That’s when I got hit by two game-changing pieces of good luck: my start date and my coverage group. My career began on the first day of the bull market of 1982, and they asked me to analyze technology stocks. In those days, there were no tech-only funds. T. Rowe Price was the leader in the emerging growth category of mutual funds, which meant they focused on technology more than anyone. I might not be able to make the first personal organizer, I reasoned, but I would be able to invest in it when it came along. In investing, they say that timing is everything. By assigning me to cover tech on the first day of an epic bull market, T. Rowe Price basically put me in a position where I had a tailwind for my entire career. I can’t be certain that every good thing in my career resulted from that starting condition, but I can’t rule it out either. It was a bull market, so most stocks were going up. In the early days, I just had to produce reports that gave the portfolio managers confidence in my judgment. I did not have a standard pedigree for an analyst, so I decided to see if I could adapt the job to leverage my strengths.
Highlight(orange) - Location 405
The personal computer business started to take off in 1985, and I noticed two things: everyone was my age, and they convened at least monthly in a different city for a conference or trade show. I persuaded my boss to let me join the caravan. Almost immediately I had a stroke of good luck. I was at a conference in Florida when I noticed two guys unloading guitars and amps from the back of a Ford Taurus. Since all guests at the hotel were part of the conference, I asked if there was a jam session I could join. There was. It turns out that the leaders of the PC industry didn’t go out to bars. They rented instruments and played music. When I got to my first jam session, I discovered I had an indispensable skill. Thanks to many years of gigs in bands and bars, I knew a couple hundred songs from beginning to end. No one else knew more than a handful. This really mattered because the other players included the CEO of a major software company, the head of R&D from Apple, and several other industry big shots. Microsoft cofounder Paul Allen played with us from time to time, but only on songs written by Jimi Hendrix. He could shred. Suddenly, I was part of the industry’s social fabric. It is hard to imagine this happening in any other industry, but I was carving my own path.
Note - Location 414
!
Highlight(orange) - Location 428
Another piece of amazing luck hit me when T. Rowe Price decided to create a growth-stage venture fund. I was already paying attention to private companies, because in those days, the competition in tech came from startups, not established companies. Over the next few years, I led three key growth-stage venture investments: Electronic Arts, Sybase, and Radius. The lead venture investor in all three companies was Kleiner Perkins Caufield & Byers, one of the leading venture capital firms in Silicon Valley. All three went public relatively quickly, making me popular both at T. Rowe Price and Kleiner Perkins. My primary contact at Kleiner Perkins was a young venture capitalist named John Doerr, whose biggest successes to that point had been Sun Microsystems, Compaq Computer, and Lotus Development. Later, John would be the lead investor in Netscape, Amazon, and Google.
Highlight(orange) - Location 435
From its launch through the middle of 1991, a period that included the 1987 crash and a second mini-crash in the summer of 1990, the fund achieved a 17 percent per annum return, against 9 percent for the S&P 500 and 6 percent for the technology index. That was when I left T. Rowe Price with John Powell to launch Integral Capital Partners, the first institutional fund to combine public market investments with growth-stage venture capital. We created the fund in partnership with Kleiner Perkins—with John Doerr as our venture capitalist—and Morgan Stanley. Our investors were the people who know us best, the founders and executives of the leading tech companies of that era.
Highlight(orange) - Location 444
In 1997, Martha Stewart came in with her home-decorating business, which, thanks to an investment by Kleiner Perkins, soon went public as an internet stock, which seemed insane to me. I was convinced that a mania had begun for dot-coms, embodied in the Pets.com sock puppet and the slapping of a little “e” on the front of a company’s name or a “.com” at the end. I knew that when the bubble burst, there would be a crash that would kill Integral if we did not do something radical. I took my concerns to our other partner, Morgan Stanley, and they gave me some money to figure out the Next Big Thing in tech investing, a fund that could survive a bear market. It took two years, but Integral launched Silver Lake Partners, the first private equity fund focused on technology. Our investors shared our concerns and committed one billion dollars to the new fund.
Highlight(orange) - Location 470
I had successful surgery in early July 2001, but my recovery was very slow. It took me nearly a year to recover fully. During that time, Apple shipped the first iPod. I thought it was a sign of good things to come and reached out to Steve Jobs to see if he would be interested in recapitalizing Apple. At the time, Apple’s share price was about twelve dollars per share, which, thanks to stock splits, is equivalent to a bit more than one dollar per share today. The company had more than twelve dollars in cash per share, which meant investors were attributing zero value to Apple’s business. Most of the management options had been issued at forty dollars per share, so they were effectively worthless. If Silver Lake did a recapitalization, we could reset the options and align interests between management and shareholders. Apple had lost most of its market share in PCs, but thanks to the iPod and iMac computers, Apple had an opportunity to reinvent itself in the consumer market. The risk/reward of investing struck me as especially favorable. We had several conversations before Steve told me he had a better idea. He wanted me to buy up to 18 percent of Apple shares in the public market and take a board seat. After a detailed analysis, I proposed an investment to my partners in the early fall of 2002, but they rejected it out of hand. The decision would cost Silver Lake’s investors the opportunity to earn more than one hundred billion dollars in profits.
2. Silicon Valley Before Facebook
Highlight(yellow) - Location 516
IBM was the dominant player in the mainframe era and made all the components for the machines it sold, as well as most of the software. That business model was called vertical integration. The era of government lasted about thirty years.
Highlight(yellow) - Location 523
Beginning in the seventies, the focus of the tech industry began to shift toward the needs of business. The era began with a concept called time sharing, which enabled many users to share the use of a single computer, reducing the cost to everyone. Time sharing gave rise to minicomputers, which were smaller than mainframes but still staggeringly expensive by today’s standards.
Highlight(yellow) - Location 545
In 1979, Dan Bricklin and Bob Frankston introduced VisiCalc, the first spreadsheet for personal computers. It is hard to overstate the significance of VisiCalc. It was an engineering marvel. A work of art. Spreadsheets on Apple IIs transformed the productivity
Highlight(yellow) - Location 553
The first IBM PC shipped in 1981, signaling a fundamental change in the tech industry that only became obvious a couple of years later, when Microsoft’s and Intel’s other customers started to compete with IBM. Eventually, Compaq, Hewlett-Packard, Dell, and others left IBM in the dust. In the long run, though, most of the profits in the PC industry went to Microsoft and Intel, whose control of the brains and heart of the device and willingness to cooperate forced the rest of the industry into a commodity business.
Highlight(yellow) - Location 572
By the mid-nineties, the wireless network evolved to a point that enabled widespread adoption of cell phones and alphanumeric pagers. The big applications were phone calls and email, then text messaging. The consumer era had begun. The business era had lasted nearly twenty years—from 1975 to 1995—but no business complained when it ended. Technology aimed at consumers was cheaper and somewhat easier to use, exactly what businesses preferred. It also rewarded a dimension that had not mattered to business: style. It took a few years for any vendor to get the formula right.
Highlight(yellow) - Location 600
In the early years of the new millennium, a game changing model challenged the page-centric architecture of the World Wide Web. Called Web 2.0, the new architecture revolved around people. The pioneers of Web 2.0 included people like Mark Pincus, who later founded Zynga; Reid Hoffman, the founder of LinkedIn; and Sean Parker, who had co-founded the music file sharing company Napster.
Highlight(yellow) - Location 603
After Napster, Parker launched a startup called Plaxo, which put address books in the cloud. It grew by spamming every name in every address book to generate new users, an idea that would be copied widely by social media platforms that launched thereafter.
Highlight(pink) - Location 605
In the same period, Google had a brilliant insight: it saw a way to take control of a huge slice of the open internet. No one owned open source tools, so there was no financial incentive to make them attractive for consumers. They were designed by engineers, for engineers, which could be frustrating to non-engineers. Google saw an opportunity to exploit the frustration of consumers and some business users. Google made a list of the most important things people did on the web, including searches, browsing, and email. In those days, most users were forced to employ a mix of open source and proprietary tools from a range of vendors. Most of the products did not work together particularly well, creating a friction Google could exploit. Beginning with Gmail in 2004, Google created or acquired compelling products in maps, photos, videos, and productivity applications. Everything was free, so there were no barriers to customer adoption. Everything worked together. Every app gathered data that Google could exploit. Customers loved the Google apps. Collectively, the Google family of apps replaced a huge portion of the open World Wide Web. It was as though Google had unilaterally put a fence around half of a public park and then started commercializing it.
Note - Location 613
!
Highlight(yellow) - Location 624
The first big Silicon Valley change related to the economics of startups. Hurdles that had long plagued new companies evaporated. Engineers could build world-class products quickly, thanks to the trove of complementary software components, like the Apache server and the Mozilla browser, from the open source community. With open source stacks as a foundation, engineers could focus all their effort on the valuable functionality of their app, rather than building infrastructure from the ground up. This saved time and money. In parallel, a new concept emerged—the cloud—and the industry embraced the notion of centralization of shared resources. The cloud is like Uber for data—customers don’t need to own their own data center or storage if a service provides it seamlessly from the cloud. Today’s leader in cloud services, Amazon Web Services (AWS), leveraged Amazon.com’s retail business to create a massive cloud infrastructure that it offered on a turnkey basis to startups and corporate customers. By enabling companies to outsource their hardware and network infrastructure, paying a monthly fee instead of the purchase price of an entire system, services like AWS lowered the cost of creating new businesses and shortened the time to market. Startups could mix and match free open source applications to create their software infrastructure. Updates were made once, in the cloud, and then downloaded by users, eliminating what had previously been a very costly and time-consuming process of upgrading individual PCs and servers. This freed startups to focus on their real value added, the application that sat on top of the stack. Netflix, Box, Dropbox, Slack, and many other businesses were built on this model.
Highlight(pink) - Location 647
Facebook’s motto—“Move fast and break things”—embodies the lean startup philosophy. Forget strategy. Pull together a few friends, make a product you like, and try it in the market. Make mistakes, fix them, repeat. For venture investors, the lean startup model was a godsend. It allowed venture capitalists to identify losers and kill them before they burned through much cash. Winners were so valuable that a fund needed only one to provide a great return.
Highlight(pink) - Location 673
To maximize both engagement and revenues, Web 2.0 startups focused their technology on the weakest elements of human psychology. They set out to create habits, evolved habits into addictions, and laid the groundwork for giant fortunes.
Highlight(pink) - Location 683
When rising oil prices triggered inflation and economic stagnation, the country transitioned into a new philosophical regime. The winner was libertarianism, which prioritized the individual over the collective good. It might be framed as “you are responsible only for yourself.” As the opposite of collectivism, libertarianism is a philosophy that can trace its roots to the frontier years of the American West. In the modern context, it is closely tied to the belief that markets are always the best way to allocate resources. Under libertarianism, no one needs to feel guilty about ambition or greed. Disruption can be a strategy, not just a consequence. You can imagine how attractive a philosophy that absolves practitioners of responsibility for the impact of their actions on others would be to entrepreneurs and investors in Silicon Valley. They embraced it. You could be a hacker, a rebel against authority, and people would reward you for it. Unstated was the leverage the philosophy conferred on those who started with advantages.
Highlight(pink) - Location 712
Under Reagan, the country also revised its view of corporate power. The Founding Fathers associated monopoly with monarchy and took steps to ensure that economic power would be widely distributed. There were ebbs and flows as the country adjusted to the industrial revolution, mechanization, technology, world wars, and globalization, but until 1981, the prevailing view was that there should be limits to the concentration of economic power and wealth. The Reagan Revolution embraced the notion that the concentration of economic power was not a problem so long as it did not lead to higher prices for consumers. Again, Silicon Valley profited from laissez-faire economics.
Highlight(pink) - Location 726
Google, Facebook, and others also broke the mold by adopting advertising business models, which meant their products were free to use, eliminating another form of friction and protecting them from antitrust regulation.
Highlight(pink) - Location 728
Their products enjoyed network effects, which occur when the value of a product increases as you add users to the network. Network effects were supposed to benefit users. In the cases of Facebook and Google, that was true for a time, but eventually the value increase shifted decisively to the benefit of owners of the network, creating insurmountable barriers to entry. Facebook and Google, as well as Amazon, quickly amassed economic power on a scale not seen since the days of Standard Oil one hundred years earlier.
Highlight(pink) - Location 732
In an essay on Medium, the venture capitalist James Currier pointed out that the key to success in the internet platform business is network effects and Facebook enjoyed more of them than any other company in history. He said, “To date, we’ve actually identified that Facebook has built no less than six of the thirteen known network effects to create defensibility and value, like a castle with six concentric layers of walls. Facebook’s walls grow higher all the time, and on top of them Facebook has fortified itself with all three of the other known defensibilities in the internet age: brand, scale, and embedding.”
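McNamee and Currier describe network effects qualitatively. The intuition can be sketched with the classic Metcalfe approximation (value grows with the number of possible connections, roughly n²) — a textbook simplification I'm supplying here, not Currier's thirteen-effect taxonomy or Facebook's actual model:

```python
# Toy comparison: linear value (a commodity product) vs. Metcalfe-style
# network value (every user can connect to every other user).
# The n*(n-1)/2 scaling is a standard simplification, assumed for illustration.

def linear_value(users: int, value_per_user: float = 1.0) -> float:
    """A product whose worth grows only with the number of buyers."""
    return users * value_per_user

def network_value(users: int, value_per_link: float = 1.0) -> float:
    """Metcalfe approximation: value tracks the number of possible links."""
    return users * (users - 1) / 2 * value_per_link

# Doubling a network's users roughly quadruples its value, which is
# one way to see why an incumbent's lead compounds into a barrier to entry.
for n in (1_000, 2_000, 4_000):
    print(f"{n:>6} users: linear={linear_value(n):>12,.0f}  network={network_value(n):>12,.0f}")
```

This is why a late entrant with half the users has far less than half the value, on this assumption.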
Highlight(orange) - Location 761
San Francisco is hip, with diverse neighborhoods, decent public transportation, access to recreation, and lots of nightlife. It attracted a different kind of person than Sunnyvale or Mountain View, including two related types previously unseen in Silicon Valley: hipsters and bros. Hipsters had burst onto the public consciousness as if from a base in Brooklyn, New York, heavy on guys with beards, plaid shirts, and earrings. They seemed to be descendants of San Francisco’s bohemian past, a modern take on the Beats. The bros were different, though perhaps more in terms of style than substance. Ambitious, aggressive, and exceptionally self-confident, they embodied libertarian values. Symptoms included a lack of empathy or concern for consequences to others. The hipster and bro cultures were decidedly male.
Note - Location 767
Lol
Highlight(pink) - Location 783
Plus, as previously noted, he had an advantage not available to earlier generations of entrepreneurs: he could build a team of people his age—many of whom had never before had a full-time job—and mold them. This allowed Facebook to accomplish things that had never been done before.
Note - Location 785
Lack of experience OK... allowed him to shape culture
3. Move Fast and Break Things
Highlight(yellow) - Location 938
Facebook’s user count reached one hundred million in the third quarter of 2008. This was astonishing for a company that was only four and a half years old, but Facebook was just getting started. Only seven months later, the user count hit two hundred million, aided by the launch of the Like button. The Like button soon defined the Facebook experience. “Getting Likes” became a social phenomenon. It gave users an incentive to spend more time on the site and joined photo tagging as a trigger for addiction to Facebook.
Highlight(pink) - Location 1014
Google realized that its data set of purchase intent would have greater value if it could be tied to customer identity. I call this McNamee’s 7th Law: data sets become geometrically more valuable when you combine them. That is where Gmail changed the game. Users got value in the form of a good email system, but Google received something far more valuable. By tying purchase intent to identity, Google laid the foundation for new business opportunities. It then created Google Maps, enabling it to tie location to purchase intent and identity. The integrated data set rivaled Amazon’s, but without warehouses and inventory it generated much greater profits for Google. Best of all, combined data sets often reveal insights and business opportunities that could not have been imagined previously. The new products were free to use, but each one contributed data that transformed the value of Google’s advertising products. Facebook did something analogous with each function it added to the platform. Photo tagging expanded the social graph. News Feed enriched it further. The Like button delivered data on emotional triggers. Connect tracked users as they went around the web. The value is not really in the photos and links posted by users. The real value resides in metadata—data about data—which is what we call the data that describes where the user was when he or she posted, what they were doing, with whom they were doing it, alternatives they considered, and more. Broadcast media like television, radio, and newspapers lack the real-time interactivity necessary to create valuable metadata. Thanks to metadata, Facebook and Google create a picture of the user that can be monetized more effectively than traditional media. When collected on the scale of Google and Facebook, metadata has unimaginable value.
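A minimal sketch of "McNamee's 7th Law" — why joined data sets reveal more than their parts. Every table, field name, and record below is invented for illustration; it is not how Google or Facebook actually stores data:

```python
# Hypothetical example: each table alone says little, but joining
# purchase intent + identity + location supports targeted inference.

search_log = {          # "purchase intent": user_id -> recent query
    "u1": "running shoes",
    "u2": "mortgage rates",
}
profile = {             # "identity": user_id -> demographics
    "u1": {"age": 29},
    "u2": {"age": 41},
}
location = {            # user_id -> last known city
    "u1": "Austin",
    "u2": "Denver",
}

def combined_record(uid: str) -> dict:
    """Join the three sets; the merged view is what an advertiser pays for."""
    return {
        "query": search_log[uid],
        **profile[uid],
        "city": location[uid],
    }

print(combined_record("u1"))
# A 29-year-old in Austin searching "running shoes" is a far more
# valuable ad target than any one of those facts in isolation.
```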
Highlight(pink) - Location 1101
Back then, for a fan page like ours, Facebook would let a really compelling post reach about 15 percent of our fans for free. The value of organic reach on Facebook compelled us and millions of others to shift the focus of our communications from a website to Facebook.
Highlight(pink) - Location 1105
Not surprisingly, there was a catch. Every year or so, Facebook would adjust the algorithm to reduce organic reach. The company made its money from advertising, and having convinced millions of organizations to set up shop on the platform, Facebook held all the cards. The biggest beneficiaries of organic reach had no choice but to buy ads to maintain their overall reach. They had invested too much time and had established too much brand equity on Facebook to abandon the platform. Organic reach declined in fits and starts until it finally bottomed at about 1 percent or less. Fortunately, Facebook would periodically introduce a new product—the Facebook Live video service, for example—and give those new products greater organic reach to persuade people like us to use them.
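The squeeze McNamee describes is easy to quantify. A back-of-envelope sketch, where the follower count and the CPM (cost per thousand paid impressions) are figures I've assumed for illustration, not numbers from the book:

```python
# How falling organic reach converts into an ad bill.
# Follower count and CPM are illustrative assumptions.

followers = 1_000_000
old_reach = 0.15   # ~15% organic reach in the early days
new_reach = 0.01   # ~1% after repeated algorithm changes
cpm = 10.0         # assumed dollars per 1,000 paid impressions

lost_impressions = followers * (old_reach - new_reach)   # per post
cost_to_restore = lost_impressions / 1_000 * cpm

print(f"impressions lost per post: {lost_impressions:,.0f}")
print(f"ad spend to restore them:  ${cost_to_restore:,.2f}")
```

Multiply that per-post cost across millions of pages posting daily and the incentive behind the algorithm changes becomes obvious.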
Highlight(yellow) - Location 1164
When The Guardian newspaper in the UK broke the story in December 2015 that Cambridge Analytica had misappropriated profiles from at least fifty million Facebook users, it precipitated an intense but brief scandal. Facebook apologized and made Cambridge Analytica sign a piece of paper, certifying that it had destroyed the data set, but then quickly returned to business as usual.
4. The Children of Fogg
Highlight(pink) - Location 1213
After graduation, Tristan enrolled in the graduate computer science master’s program at Stanford. In his first term, he took a class in persuasive technology with Professor B. J. Fogg, whose textbook, Persuasive Technology, is the standard in the field. Professors at other universities teach the subject, but being at Stanford gave Fogg outsized influence in Silicon Valley. His insight was that computing devices allow programmers to combine psychology and persuasion concepts from the early twentieth century, like propaganda, with techniques from slot machines, like variable rewards, and tie them to the human social need for approval and validation in ways that few users can resist. Like a magician doing a card trick, the computer designer can create the illusion of user control when it is the system that guides every action. Fogg’s textbook lays out a formula for persuasion that clever programmers can exploit more effectively on each new generation of technology to hijack users’ minds. Prior to smartphones like the iPhone and Android, the danger was limited. After the transition to smartphones, users did not stand a chance. Fogg did not help. As described in his textbook, Fogg taught ethics by having students “work in small teams to develop a conceptual design for an ethically questionable persuasive technology—the more unethical the better.” He thought this was the best way to get students to think about the consequences of their work.
Highlight(pink) - Location 1254
INTERNET PLATFORMS HAVE EMBRACED B. J. Fogg’s approach to persuasive technology, applying it in every way imaginable on their sites. Autoplay and endless feeds eliminate cues to stop. Unpredictable, variable rewards stimulate behavioral addiction. Tagging, Like buttons, and notifications trigger social validation loops. As users, we do not stand a chance. Humans have evolved a common set of responses to certain stimuli—“flight or fight” would be an example—that can be exploited by technology. When confronted with visual stimuli, such as vivid colors—red is a trigger color—or a vibration against the skin near our pocket that signals a possible enticing reward, the body responds in predictable ways: a faster heartbeat and the release of a neurotransmitter, dopamine. In human biology, a faster heartbeat and the release of dopamine are meant to be momentary responses that increase the odds of survival in a life-or-death situation. Too much of that kind of stimulus is a bad thing for any human, but the effects are particularly dangerous in children and adolescents. The first wave of consequences includes lower sleep quality, an increase in stress, anxiety, depression, an inability to concentrate, irritability, and insomnia. That is just the beginning. Many of us develop nomophobia, which is the fear of being separated from one’s phone. We are conditioned to check our phones constantly, craving ever more stimulation from our platforms of choice.
Highlight(pink) - Location 1278
Tristan makes the case that platforms compete in a race to the bottom of the brain stem—where the AIs present content that appeals to the low-level emotions of the lizard brain, things like immediate rewards, outrage, and fear.
Highlight(pink) - Location 1290
In 2014, Facebook published a study called “Experimental Evidence of Massive-Scale Emotional Contagion Through Social Networks,” where they manipulated the balance of positive and negative messages in the News Feeds of nearly seven hundred thousand users to measure the influence of social networks on mood. In its internal report, Facebook claimed the experiment provided evidence that emotions can spread over its platform. Without getting prior informed consent or providing any warning, Facebook made people sad just to see if it could be done. Confronted with a tsunami of criticism, Sheryl Sandberg said this: “This was part of ongoing research companies do to test different products, and that was what it was; it was poorly communicated. And for that communication we apologize. We never meant to upset you.”
Highlight(yellow) - Location 1350
It is not for nothing that the industry jokes about YouTube’s “three degrees of Alex Jones,” referring to the notion that no matter where you start, YouTube’s algorithms will often surface a Jones conspiracy theory video within three recommendations. In an op-ed in Wired, my colleague Renée DiResta quoted YouTube chief product officer Neal Mohan as saying that 70 percent of the views on his platform are from recommendations. In the absence of a commitment to civic responsibility, the recommendation engine will be programmed to do the things that generate the most profit. Conspiracy theories cause users to spend more time on the site.
Highlight(yellow) - Location 1355
Once a person identifies with an extreme position on an internet platform, he or she will be subject to both filter bubbles and human nature. A steady flow of ideas that confirm beliefs will lead many users to make choices that exclude other ideas both online and off. As I learned from Clint Watts, a national security consultant for the FBI, the self-imposed blocking of ideas is called a preference bubble. Filter bubbles are imposed by others, while a preference bubble is a choice. By definition, a preference bubble takes users to a bad place, and they may not even be conscious of the change.
Bookmark - Location 1396
Highlight(pink) - Location 1407
The company tests every pixel to ensure it produces the desired response. Which shade of red best leads people to check their notifications? For how many milliseconds should notifications bubbles appear in the bottom left before fading away, to most effectively keep users on site? Based on what measures of closeness should we recommend new friends for you to “add”? When you have more than two billion users, you can test every possible configuration.
5. Mr. Harris and Mr. McNamee Go to Washington
Highlight(yellow) - Location 1630
Once you get past the buzzwords, tech is not particularly complicated in comparison to other industries that Congress regulates. Health care and banking are complex industries that Congress has been able to regulate effectively, despite the fact that relatively few policy makers have had as much involvement with them as they have with technology, which touches everyone, including members of Congress, on a daily basis.
Highlight(yellow) - Location 1634
Critics who charge that regulation is too blunt an instrument for an industry like tech are not wrong, but they miss the point. The goal of regulation is to change incentives. Industries that ignore political pressure for reform, as the internet platforms have, should expect ever more onerous regulatory initiatives until they cooperate. The best way for tech to avoid heavy regulation is for the industry leaders to embrace light regulation and make appropriate changes to their business practices.
Highlight(pink) - Location 1683
We also shared a hypothesis that the lack of data dumps from the DCCC hack meant the data might have been used in congressional campaigns instead. The WikiLeaks email dumps had all come from the DNC hack. The DCCC, by contrast, would have had data that might be used for social media targeting, but more significantly, Democratic Party data from every congressional district. The data would have been the equivalent of inside information about Democratic voters. Hypothetically, the DCCC data might have allowed the Russians—or potentially someone in the Republican Party—to see which Democrats in a district could be persuaded to stay home instead of voting. We learned later that during the final months of the 2016 campaign, the Russians had concentrated their spending in the states and congressional districts that actually tipped the election.
Highlight(pink) - Location 1698
I concluded with an observation: the Russians might have used Facebook and other internet platforms to undermine democracy and influence a presidential election for roughly one hundred million dollars, or less than the price of a single F-35 fighter. It was just an educated guess on my part, based on an estimate of what eighty to one hundred hackers might cost for three or four years, along with a really large Facebook ad budget. In reality, the campaign may have cost less, but given the outcome, one hundred million dollars would have been a bargain.
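McNamee's guess can be reproduced as rough arithmetic. He states only the headcount range and duration; the per-operative annual cost and the ad budget below are my assumptions, chosen to show the total plausibly fits under $100 million:

```python
# Back-of-envelope version of McNamee's estimate.
# cost_per_hacker_year and ad_budget are assumed figures, not from the book.

hackers = 90                     # midpoint of his 80-100 range
years = 3.5                      # midpoint of "three or four years"
cost_per_hacker_year = 100_000   # assumed fully loaded annual cost, USD
ad_budget = 50_000_000           # assumed "really large" Facebook ad spend

personnel = hackers * years * cost_per_hacker_year
total = personnel + ad_budget

print(f"personnel: ${personnel:,.0f}")
print(f"total:     ${total:,.0f}")
# Under these assumptions the whole campaign lands comfortably below
# $100M -- roughly the price of a single F-35.
```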
6. Congress Gets Serious
Highlight(yellow) - Location 1780
Renée explained that the typical path for disinformation or a conspiracy theory is to be incubated on sites like Reddit, 4chan, or 8chan. There are many such stories in play at any time, a handful of which attract enough support to go viral. For the Russians, any time a piece of disinformation gained traction, they would seed one or more websites with a document that appeared to be a legitimate news story about the topic. Then they would turn to Twitter, which has replaced the Associated Press as the news feed of record for journalists. The idea was to post the story simultaneously on an army of Twitter accounts, with a link to the fake news story. The Twitter army might consist of a mix of real accounts and bots. If no journalist picked up the story, the Twitter accounts would post new messages saying some variant of “read the story that the mainstream media doesn’t want you to know about.” Journalism is intensely competitive, with a twenty-four-hour news cycle that allows almost no time for reflection. Eventually some legitimate journalist may write about the story. Once that happens, the game really begins. The army of Twitter accounts—which includes a huge number of bots—tweets and retweets the legitimate story, amplifying the signal dramatically. Once a story is trending, other news outlets are almost certain to pick it up. At that point, it’s time to go for the mass market, which means Facebook. The Russians would have placed the story in the Facebook Groups they controlled, counting on Facebook’s filter bubbles to ensure widespread acceptance of the veracity of the story, as well as widespread sharing. Trolls and bots help, but the most successful disinformation and conspiracy theories leveraged American citizens who trusted the content they received from fellow members of Facebook Groups.
Bookmark - Location 1780
Highlight(orange) - Location 1840
Our meeting concluded with a request from the House Intelligence Committee staffers: Could we help them learn about Facebook and Twitter?
Highlight(pink) - Location 1879
The night before the first hearing, Facebook disclosed that 126 million users had been exposed to Russian interference, as well as 20 million users on Instagram.
Highlight(pink) - Location 1884
The fact that four million people who voted for Obama in 2012 did not vote for Clinton in 2016 may reflect to some degree the effectiveness of the Russian interference. How many of those stayed away because of Russian disinformation about Clinton’s email server, the Clinton Foundation, Pizzagate, and other issues?
Highlight(pink) - Location 1890
In an election where only 137 million people voted, a campaign that targeted 126 million eligible voters almost certainly had an impact. How would Facebook spin that?
7. The Facebook Way
Highlight(pink) - Location 1994
Thanks to its search engine, cloud services, and venture capital operation, Google has an exceptionally good view of emerging products.
Highlight(pink) - Location 2007
One of the competitors Facebook has reportedly tracked with Onavo is Snapchat. There is bad blood between the two companies that began after Snapchat rejected an acquisition offer from Facebook in 2013. Facebook started copying Snapchat’s key features in Instagram, undermining Snapchat’s competitive position. While Snapchat managed to go public and continues to operate as an independent company, the pressure from Facebook continues unchecked and has taken a toll. Under a traditional antitrust regime, Snapchat would almost certainly have a case against Facebook for anticompetitive behavior.
Highlight(pink) - Location 2074
It is the most centralized decision-making structure I have ever encountered in a large company, and it is possible only because the business itself is not complicated.
Highlight(yellow) - Location 2124
If Chamath had continued to question Facebook’s mission, it is quite possible that the people he hired at the company, and those who knew him, might begin to question their leaders’ and company’s choices. The result might be a Susan Fowler Moment, named for the Uber engineer whose blog post about that company’s toxic culture led to an employee revolt and, ultimately, the departure of the executive team.
8. Facebook Digs in Its Heels
Highlight(yellow) - Location 2164
Facebook finished 2017 as it had begun it, by not giving an inch, thus violating a central precept of crisis management: embracing criticism.
Highlight(yellow) - Location 2190
As an early adopter of electronic mail, Microsoft managed its global operations with the shortest lag times imaginable. Email enabled Microsoft employees in the remotest parts of Australia, South America, Africa, or Asia to escalate problems through the chain of command to the proper decision maker in Redmond in a matter of hours. It is hard to overstate the significance of the breakthrough represented by Microsoft’s email system and the competitive advantage it provided.
Highlight(pink) - Location 2235
What do I mean by human-driven technology? I want to see a return to technology that leverages the human intellect, consistent with Steve Jobs’s “bicycle for the mind” metaphor. Human-driven products do not prey on human weakness. They compensate for weakness in users and leverage strengths. This means taking steps to prevent addiction and, when those fail, to mitigate the downsides. The design of devices should deliver utility without dependence.
Highlight(yellow) - Location 2309
George may have started from my Washington Monthly essay, but the final speech went much further, tying the threat from internet platforms to geopolitics. By the time I left the Soros home, I had every hope that George’s speech would have an impact. It did. The Soros speech at Davos on January 25 reverberated through the halls of government in both Europe and the United States, reframing the conversation from the relatively narrow confines of the US presidential election to the much broader space of global economics and politics.
9. The Pollster
Highlight(pink) - Location 2333
The problem is that Facebook really is a media company. It exercises editorial judgment in many ways, including through its algorithms. Facebook’s position has always been that users choose their friends and which links to view, but in reality, Facebook selects and sequences content for each user’s News Feed, an editorial process that had led to criticism in the past, most notably when conservatives accused the company in May 2016 of bias in its Trending Stories feature.
Highlight(orange) - Location 2359
That same day, The Verge published a story by Casey Newton about Tavis McGinn, who had recently left Facebook after a six-month stint as the personal pollster for Zuck and Sheryl. The story shocked us. Why would Facebook—which employs a small army to survey users on every issue imaginable—need to hire a new person for the sole purpose of polling the popularity of its two top executives? More remarkable was the timing: Tavis had been at Facebook from April through September 2017. They had hired the pollster while they were still denying any involvement in the Russian interference.
Highlight(orange) - Location 2365
“I joined Facebook hoping to have an impact from the inside,” he says. “I thought, here’s this huge machine that has a tremendous influence on society, and there’s nothing I can do as an outsider. But if I join the company, and I’m regularly taking the pulse of Americans to Mark, maybe, just maybe that could change the way the company does business. I worked there for six months and I realized that even on the inside, I was not going to be able to change the way that the company does business. I couldn’t change the values. I couldn’t change the culture. I was probably far too optimistic. “Facebook is Mark, and Mark is Facebook,” McGinn says. “Mark has 60 percent voting rights for Facebook. So you have one individual, 33 years old, who has basically full control of the experience of 2 billion people around the world. That’s unprecedented. Even the president of the United States has checks and balances. At Facebook, it’s really this one person.”
10. Cambridge Analytica Changes Everything
Highlight(orange) - Location 2447
March 2018 brought almost daily revelations about unintended damage from social media. Science magazine published a study conducted by professors at MIT of every controversial story in English on Twitter. It revealed that disinformation and fake news are shared 70 percent more often than factual stories and spread roughly six times faster. The study noted that bots share facts and disinformation roughly equally, suggesting that it is humans who prefer to share falsehoods.
Highlight(yellow) - Location 2481
In Myanmar, Free Basics transformed internet access by making it available to the masses. That is also the case in most of the other countries that have adopted the service. The citizens of these countries are not used to getting information from media. Prior to Free Basics, they had little, if any, exposure to journalism and no preparation for social media. Their citizens did not have filters for the kind of disinformation shared on internet platforms. An idea that sounded worthy to people in the US, Free Basics has been more dangerous than I suspect its creators would have imagined. In Myanmar, a change in government policy caused an explosion in wireless usage, making Facebook the most important communications platform in the country. When allies of the ruling party used Facebook to promote violence against the Rohingya minority, the company fell back on its usual strategy of an apology and a promise to do better.
Highlight(yellow) - Location 2550
In the year prior to the IPO, Zynga alone accounted for twelve percent of Facebook’s revenue. Zynga’s ability to leverage friends lists contributed to an insight: giving third-party developers access to friends lists would be a huge positive for Facebook’s business. Social games like FarmVille cause people to spend much more time on Facebook. Users see a lot of ads. Zynga had a brilliant insight: adding a social component to its games would leverage Facebook’s architecture and generate far more revenue, creating an irresistible incentive for Facebook to cooperate. In 2010, Facebook introduced a tool that enabled third-party developers to harvest friends lists and data from users. They saw the upside of sharing friends lists. If they recognized the potential for harm, they did not act on it. Despite the 2011 consent decree with the FTC, the tool remained available for several more years.
Highlight(yellow) - Location 2672
The analyst in me could not help but notice that the story of Facebook’s role in the 2016 election had unfolded with one consistent pattern: Facebook would first deny, then delay, then deflect, then dissemble. Only when the truth was unavoidable did Facebook admit to its role and apologize.
Highlight(pink) - Location 2675
Zeynep Tufekci, a brilliant scholar from the University of North Carolina, framed Facebook’s history as a “fourteen-year apology tour.” I reflected that it might be time to tweak Facebook’s corporate motto: Move fast, break things, apologize, repeat.
11. Days of Reckoning
Highlight(pink) - Location 2780
As the newest generation in an industry with a long track record of good corporate behavior and products that made life better for customers, internet platforms inherited the benefits of fifty years of trust and goodwill. Today’s platforms emerged at a time when economic philosophy in the United States had embraced deregulation as foundational. The phrase “job-killing regulations” had developed superpowers in the political sphere, chilling debate and leading many to forget why regulations exist in the first place. No government or agency creates a regulation with the goal of killing jobs. They do it to protect employees, customers, the environment, or society in general. With an industry like tech, where corporate behavior had been relatively benign for generations, few policy makers imagined the possibility of a threat.
Highlight(pink) - Location 2880
Public pressure produced more concessions from Facebook, which announced additional policy and product changes in an attempt to appear cooperative and preempt regulatory action. As usual, the announcements featured sleight of hand. First, Facebook banned data brokers. While this sounded like a move that might prevent future Cambridge Analyticas, what it actually did was move Facebook closer to a data monopoly on its platform. Advertisers acquire data from brokers in order to improve ad targeting. By banning data brokers, Facebook forced advertisers to depend entirely on Facebook’s own data.
Highlight(yellow) - Location 2890
The hearings began on the afternoon of April 10 in the Senate. Zuck arrived in a suit, shook lots of hands, and settled in for five hours of questions. The combined committees have a total of forty-five members. Each senator would have only four minutes, which favored Zuck, who prepared well for the format. If he could be long-winded with each answer, Zuck might be able to limit each senator to only three or four questions. Perhaps more important, the most senior members of the committee went first, and they were not as well prepared as Zuck. Whether by luck or design, Facebook had agreed to appear on the first day after a two-week recess, minimizing the opportunity for staff members to prepare senators. The benefits of that timing to Facebook were immediately obvious. Several senators did not seem to understand how Facebook works. Senator Orrin Hatch asked, “How do you sustain a business model in which users don’t pay for your service?” revealing his ignorance about Facebook’s advertising business model. Armed with a cheat sheet of diplomatic answers, Zuck patiently ran out the clock on each senator. Senators attempted to grill Zuck, and in the second hour, a couple of senators asked pointed questions. For the most part, Zuck deflected. Zuck also benefited from a lack of coordination among the senators. It seemed that each senator addressed a different issue.
12. Success?
Highlight(yellow) - Location 3021
In a conversation with Representative Joe Kennedy days after the House hearing, Zuck indicated that Clear History would apply to metadata as well as links, which would represent a huge departure from recent practice and a genuine benefit to users. Skeptics point to a more ominous explanation. Facebook has been using its massive store of user data to train the behavioral-targeting engine of its artificial intelligence. In the early phases of training, the engine needs every piece of data Facebook can find, but eventually the training reaches a level where the engine can anticipate user behavior. Perhaps you have heard anecdotes about people saying a brand name out loud and then seeing an ad for that brand on Facebook. The users assume Facebook must be using their device’s microphone to eavesdrop on conversations. That is not practical today. A more likely explanation is that the behavioral-prediction engine has made a good forecast about a user desire, and the brand in question happens to be a Facebook advertiser. It is deeply creepy. It will get creepier as the technology improves. Once the behavioral-prediction engine can forecast consistently, it will no longer need the same amount of data that was required to create it. A smaller flow of data, much of it metadata, will get the job done. If that is where we are, letting users clear their browsing history on Facebook would provide the illusion of privacy, without changing Facebook’s business or protecting users.
Highlight(pink) - Location 3036
When the AI behavioral-prediction engines of Facebook and Google reach maturity, they may be able to abandon the endless accumulation of content data and some forms of metadata—addressing a meaningful subset of the privacy concerns that have been raised in the press and in congressional hearings—without actually improving users’ privacy.
Highlight(orange) - Location 3105
I had just written an op-ed for the Financial Times on the subject of the 1956 consent decree with AT&T, which ended that company’s first antitrust case, but it had not yet been published, so I gave Representative Lofgren a preview. The decree had two key elements: AT&T agreed to limit itself to its existing regulated markets, which meant the landline telephone business, and it agreed to license its patent portfolio at no cost. By limiting itself to regulated markets, AT&T would not enter the nascent computer industry, leaving that to IBM and others. This was a very big deal and was consistent with historical practice. AT&T owed its own existence to a prohibition on telegraph companies entering telephony. Allowing the computer industry to develop as its own category proved to be good policy in every possible way. Compulsory licensing of the AT&T patent portfolio turned out to be even more important. AT&T’s Bell Labs did huge amounts of research that led to a wide range of fundamental patents. Included among them was the transistor. By making the transistor available for license, the 1956 consent decree gave birth to Silicon Valley. All of it. Semiconductors. Computers. Software. Video games. The internet. Smartphones. Is there any way that the US economy would have been better off allowing AT&T to exploit the transistor on its own timeline? Does anyone think there is a chance AT&T would have done as good a job with that invention as the thousands of startups it spawned in Silicon Valley? Here’s the clincher: the 1956 consent decree did not prevent AT&T from being amazingly successful, so successful that it precipitated a second antitrust case. The company was ultimately broken up in 1984, a change that unleashed another tsunami of growth.
Highlight(orange) - Location 3120
Applying the logic of the 1956 AT&T consent decree to Google, Amazon, and Facebook would set limits to their market opportunity, creating room for new entrants. That might or might not require the divestiture of noncore operations. There is nothing in the patent portfolios of the platform giants that rivals the transistor, but there is no doubt in my mind that the giants use patents as a defensive moat to keep competitors at bay. Opening up those portfolios would almost certainly unleash tremendous innovation, as there are thousands of entrepreneurs who might jump at an opportunity to build on the patents.
Highlight(orange) - Location 3125
Harvard professor Jonathan Zittrain had written an op-ed in The New York Times that recommended extending to data-intensive companies the fiduciary rule that applies to professions that hold sensitive data about clients. As fiduciaries, doctors and lawyers must always place the needs of the client first, safeguarding privacy. If doctors and lawyers were held to the same standard as internet platforms, they would be able to sell access to your private information to anyone willing to pay. Extending the fiduciary rule to companies that hold consumer data—companies like Equifax and Acxiom, as well as internet platforms—would have two benefits. First, it would create a compelling incentive for companies to prioritize data privacy and security. Second, it would enable consumers (and businesses) harmed by data holders to have a legal remedy that cannot be unilaterally eliminated by companies in their terms of service. Today, the standard practice is to force users who feel they have been harmed to go into arbitration, a process that has historically favored companies over their customers. If consumers always had the option of litigation, companies would be less likely to act carelessly.
Highlight(orange) - Location 3181
Facebook told Motherboard that its AI tools detect almost all of the spam it removes from the site, along with 99.5 percent of terrorist-related content removals, 98.5 percent of fake account removals, 96 percent of nudity and sexual content removals, 86 percent of graphic violence removals, and 38 percent of hate speech removals. Those numbers sound impressive, but require context. First, these numbers merely illustrate AI’s contribution to the removal process. We still do not know how much inappropriate content escapes Facebook’s notice. With respect to AI’s impact, a success rate of 99.5 percent will still allow five million inappropriate posts per billion. For context, there were nearly five billion posts a day on Facebook . . . in 2013. AI’s track record on hate speech—accounting for 38 percent of removals—is not helpful at all. Human moderators struggle to find inappropriate content missed by AI, but they are forced to operate within the constraints of Facebook’s twin goals of maximum permissiveness and generalized solutions to all problems. The rule book for moderators is long and detailed, but also filled with conflicts and ambiguity. Moderators burn out very quickly.
Highlight(orange) - Location 3194
To get a sense of the impact, I asked Erin McKean, founder of Wordnik and former editor of the Oxford Dictionary of American English, to study changes in the nouns and adjectives most frequently associated with each of the largest tech companies: Apple, Google, Amazon, Facebook, and Microsoft, plus Twitter. Prior to the 2016 election, the tech leaders enjoyed pristine reputations, with no pejorative word associations. For Google, Amazon, Apple, and Microsoft, that is still true. For Facebook, things have changed dramatically. The word “scandal” now ranks in the top 50 nouns associated with Facebook. “Breach” and “investigation” are in the top 250 nouns. With adjectives the situation is even worse. Alone among the five tech leaders, Facebook had one pejorative adjective in its top 100 in 2015–2016: “controversial.” In 2017 and 2018, the adjective “fake” ranked in the top 10 for Facebook, followed by “Russian,” “alleged,” “critical,” “Russian-linked,” “false,” “leaked,” and “racist,” all of which ranked in the top 100 adjectives. Apple, Google, Amazon, and Microsoft do not have a single pejorative noun or adjective on their lists. Twitter has two nouns on its list that may or may not imply brand issues: “Trump” and “bots.” The study was conducted using the News on the Web (NOW) corpus at Brigham Young University. The top 10 US sources in the corpus, ranked by number of words, are Huffington Post, NPR, CNN, The Atlantic, TIME, Los Angeles Times, Wall Street Journal, Slate, USA Today, and ABC News.
Highlight(pink) - Location 3236
The design of Facebook trained users to unlock their emotions, to react without critical thought. On a small scale this would not normally be a problem, but at Facebook’s scale it enables emotional contagion, where emotions overwhelm reason. Emotional contagion is analogous to wildfire. It will spread until it runs out of fuel. Left unchecked, hate speech leads to violence; disinformation undermines democracy. When you connect billions of people, hate speech and disinformation are inevitable. If you operate a large public network, you have to anticipate wildfires of hate speech and disinformation. In the real world, firefighters combat wildfires with a strategy of containment. Similarly, financial markets limit panics with circuit breakers that halt trading long enough to ensure that prices reflect a balance between facts and emotion. Facebook grew to 2.2 billion monthly users without imagining the risk of emotional contagion, much less developing a strategy for containing it.
Bookmark - Location 3247
Highlight(pink) - Location 3247
The internet platforms have harvested fifty years of trust and goodwill built up by their predecessors. They have taken advantage of that trust to surveil our every action online, to monetize personal data. In the process they have fostered hate speech, conspiracy theories, and disinformation, and enabled interference in elections. They have artificially inflated their profits by shirking civic responsibility. The platforms have damaged public health, undermined democracy, violated user privacy, and, in the case of Facebook and Google, gained monopoly power, all in the name of profits.
13. The Future of Society
Highlight(yellow) - Location 3425
In terms of economic policy, I want to set limits on the markets in which monopoly-class players like Facebook, Google, and Amazon can operate. The economy would benefit from breaking them up. A first step would be to prevent acquisitions, as well as cross subsidies and data sharing among products within each platform. I favor regulation and believe it can address a portion of the threat posed by Google and Amazon. Unfortunately, relative to Facebook, there is no preexisting model of regulation to address the heart of the problem, which relates to the platform’s design and business model.
Highlight(yellow) - Location 3488
Campaigns can buy a list of two hundred million voting-age Americans with fifteen hundred data points per person from a legitimate data broker for seventy-five thousand dollars. Commercial users have to pay more, but not that much more. Think about that. Commercial data brokers do not sell lists that have been paired with voter files, so it would take some effort to replicate the data set created by Cambridge Analytica, but it can definitely be done by any sufficiently motivated party. A data set that includes Facebook user IDs gets access to the latest user data every time it is used inside Facebook.
Highlight(pink) - Location 3514
Google collects more data than anyone else. They get your data from search results, Gmail, Google Maps, YouTube, and every other app they offer. They acquire your credit card data, as well as data from other offline sources. They use artificial intelligence to create a filter bubble of things you like in your search results. They use their sea of data to crush competitors. Google’s most glaring problems are in YouTube, which offers a triple-header of harm: the Kids channel, the promotion of disinformation, and the recruiting/training of extremists. For whatever reason, Google has not been able to fix any of these problems. Were it not for Facebook, we would be having this conversation about Google and YouTube.
Highlight(pink) - Location 3608
If I were running Google, I would embrace GDPR like a religion. I would escape the lifeboat Google has been sharing with Facebook and create as much distance as possible. As the largest internet platform, Google would enjoy a huge relative advantage in terms of the cost of compliance. It would also radically improve its relative standing with regulators around the world. I believe that enthusiastic compliance with the spirit of GDPR would lead to an increase in user trust, which would almost certainly pay dividends in the future. I would spin off YouTube to shareholders, as that would create a more powerful incentive to reduce the threat from disinformation and extremism. Google does not seem to understand that the primary reason its users and policy makers are not up in arms is because Facebook is worse.
Highlight(pink) - Location 3649
The US economy has historically depended on startups far more than other economies, especially in technology. If my hypothesis is correct, the country has begun an experiment in depending on monopolists for innovation, economic growth, and job creation. If I consider Google, Amazon, and Facebook purely in investment terms, I cannot help but be impressed by the brilliant way they have executed their business plans. The problem is unintended consequences, which are more numerous and severe than I imagined. These companies do not need to choke off startup activities to be successful, but they cannot help themselves. That is what monopolists do. Does the country want to take that risk?
Highlight(pink) - Location 3661
I would like to think that Silicon Valley can earn a living without killing millions of jobs in other industries. In the mid-seventies and eighties, when the US first restructured its economy around information technology, tech enabled companies to eliminate layers of middle management, but the affected people were rapidly absorbed in more attractive sectors of the economy. That is no longer the case. The economy is creating part-time jobs with no benefits and no security—driving for Uber or Lyft, for example—but not creating jobs that support a middle-class lifestyle, in part because that has not been a priority. One opportunity for the government is to create tax incentives for tech businesses (and others) to retrain and create jobs for workers threatened by recent changes in the economy. Teaching everyone to code is not the answer, as coding will likely be an early target for automation through artificial intelligence.
Highlight(pink) - Location 3672
Users should always own all their own data and metadata. No one should be able to use a user’s data in any way without explicit, prior consent. Portability of users’ social graphs from any site will transfer power to users and promote competition. Third-party audits of algorithms, comparable to what exists now for financial statements, would create the transparency necessary to limit undesirable consequences. There should be limits on what kind of data can be collected, such that users can limit data collection or choose privacy. This needs to be done immediately, before the Internet of Things reaches mass adoption.
14. The Future of You
Highlight(yellow) - Location 3751
I avoid using Google wherever possible because of its data-collection policies. Avoiding Google is inconvenient, so I have turned it into a game. I use DuckDuckGo as my search engine because it does not collect search data. I use Signal for texting. I don’t use Gmail or Google Maps. I use a tracking blocker called Ghostery so that Google, Facebook, and others cannot follow me around the web. I am far from invisible on the web, but my shadow is smaller.
Bibliographic Essay
Highlight(orange) - Location 4326
There are several good books about the culture in Silicon Valley. A good place to start is Brotopia: Breaking Up the Boys’ Club of Silicon Valley, by Emily Chang (New York: Portfolio, 2018). Chang is the host of Bloomberg Technology, the show on which I interviewed Tristan in April 2017, after his appearance on 60 Minutes. (Emily was on maternity leave that day!) The domination of Silicon Valley by young Asian and Caucasian men seems foundational to the culture that built Facebook, YouTube, and the others. Chang cuts to the heart of the matter.
Highlight(orange) - Location 4354
Once I understood how persuasive technology works, I consulted two fantastic books on the psychological impact of persuasive technology on smartphones, tablets, and computers. Irresistible: The Rise of Addictive Technology and the Business of Keeping Us Hooked, by Adam Alter (New York: Penguin Press, 2017), is comprehensive, well written, and easy to understand. It covers a wide range of harms across every age group. A must-read. Glow Kids: How Screen Addiction Is Hijacking Our Kids—and How to Break the Trance, by Nicholas Kardaras (New York: St. Martin’s Press, 2016), focuses on kids. If you have young children, I predict this book will cause you to limit their exposure to screens and to protect them from a range of applications. Another important book is It’s Complicated: The Social Lives of Networked Teens, by danah boyd (New Haven: Yale University Press, 2014). This should be required reading for parents.