Facebook follows you everywhere

The Slow Death of ‘Do Not Track’

By FRED B. CAMPBELL Jr.    DEC. 26, 2014

HAYMARKET, Va. — Four years ago, the Federal Trade Commission announced, with fanfare, a plan to let American consumers decide whether to let companies track their online browsing and buying habits. The plan would let users opt out of the collection of data about their habits through a setting in their web browsers, without having to decide on a site-by-site basis.

The idea, known as “Do Not Track,” and modeled on the popular “Do Not Call” rule that protects consumers from unwanted telemarketing calls, is simple. But the details are anything but.
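
In practice, the “Do Not Track” signal is just an HTTP request header (DNT: 1) that the browser attaches to each request; whether a site honors it is entirely up to the site. The sketch below shows what honoring it could look like on the server side, using a hypothetical Express endpoint and cookie name rather than any real company's code:

```typescript
// Minimal sketch (hypothetical endpoint and cookie name): an Express server
// that reads the browser's DNT header and only sets a tracking cookie when
// the user has not opted out.
import express, { Request, Response } from "express";
import { randomUUID } from "node:crypto";

const app = express();

app.get("/article", (req: Request, res: Response) => {
  const optedOut = req.get("DNT") === "1"; // "1" means the user opted out of tracking

  if (!optedOut) {
    // Tag the visitor so later page views can be tied back to this browser.
    res.cookie("visitor_id", randomUUID(), {
      maxAge: 365 * 24 * 60 * 60 * 1000, // one year, in milliseconds
      httpOnly: true,
    });
  }

  res.send(optedOut ? "DNT honored: no tracking cookie set." : "Tracking cookie set.");
});

app.listen(3000);
```

The multiyear fight described in this piece is, at bottom, over who, if anyone, would be obliged to perform a check like this.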

Although many digital advertising companies agreed to the idea in principle, the debate over the definition, scope and application of “Do Not Track” has been raging for several years.

Now, finally, an industry working group is expected to propose detailed rules governing how the privacy switch should work. The group includes experts but is dominated by Internet giants like Adobe, Apple, Facebook, Google and Yahoo. It is poised to recommend a carve-out that would effectively free them from honoring “Do Not Track” requests.

If regulators go along, the rules would allow the largest Internet giants to continue scooping up data about users on their own sites and on other sites that include their plug-ins, such as Facebook’s “Like” button or an embedded YouTube video. This giant loophole would make “Do Not Track” meaningless.

How did we get into this mess?

For starters, the Federal Trade Commission doesn’t seem to fully understand the nature of the Internet.

Online companies typically make money by utilizing data gleaned from their users to sell targeted ads. If the flow of user data slows down, so does the money. A study commissioned by the Interactive Advertising Bureau with researchers from Harvard Business School underscores the point: at least half of the Internet’s economic value is based on the collection of individual user data, and nearly all commercial content on the Internet relies on advertising to some extent. Digital advertising grew to a $42.8 billion business last year, a sum that already exceeds spending on broadcast television advertising.

Essentially, the collection of user data makes possible the free access to maps, email, games, music, social networks and other services.

Digital privacy advocates, understandably, view the online ecosystem differently. They are alarmed by the growth of the surveillance economy, in which companies compile and store information about what a user reads, looks for, clicks on or buys. In this world, disclosure is fairly meaningless, because almost no one reads the terms of service that define the relationship between the customer and the company.

The regulatory process is the wrong way to address this fundamental tension. If the government wants to shift the Internet economy away from a “barter” system (exchanging personal data for free services) toward a subscription-based system, Congress should take charge.

Even worse, the Federal Trade Commission has abandoned responsibility, all but throwing up its hands. Instead of leading the effort to write good rules, based on the broadest public participation, the commission has basically surrendered control of the process to the industry panel, the “tracking protection working group” of the World Wide Web Consortium, or W3C.

The outcome could be worse than doing nothing at all.

The industry recommendation is expected to distinguish between companies that have a “first party” relationship with users — consumer-facing Internet content providers and Internet service providers — and “third party” companies, which include most small advertising-technology companies.

First-party relationships would be created if the user “intends to interact” with the web company (or a service provider acting on behalf of that company). For example, logging into Facebook would count as a “user action” that would allow Facebook to track your activity “across multiple distinct contexts,” including other websites.

In contrast, companies with third-party relationships would have far more limited tracking abilities. For example, if a user visits a site that integrates an advertisement with content from other sources, the ad server would not be able to place a tracking “cookie” for marketing purposes on that user’s device without consent.
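
To picture the distinction: a third-party ad server embedded on a publisher's page can only follow users across sites by setting a cross-site cookie, and under the draft rules it would first need consent. The following sketch is illustrative only; the consent flag, cookie names and endpoint are hypothetical, and real consent frameworks are far more involved.

```typescript
// Illustrative sketch of a third-party ad server under the proposed rule:
// it sets a cross-site tracking cookie only when a (hypothetical) consent
// cookie is present and the browser has not sent DNT: 1.
import express, { Request, Response } from "express";
import cookieParser from "cookie-parser";
import { randomUUID } from "node:crypto";

const adServer = express();
adServer.use(cookieParser());

adServer.get("/ad.js", (req: Request, res: Response) => {
  const optedOut = req.get("DNT") === "1";
  const consented = req.cookies["ad_consent"] === "yes"; // hypothetical consent flag

  if (!optedOut && consented) {
    // SameSite=None; Secure is what allows a cookie to be sent in third-party
    // (cross-site) contexts in modern browsers.
    res.cookie("3p_tracker", randomUUID(), {
      sameSite: "none",
      secure: true,
      maxAge: 90 * 24 * 60 * 60 * 1000, // 90 days
    });
  }

  // The ad itself loads either way; only the tracking differs.
  res.type("application/javascript").send("/* ad creative would load here */");
});

adServer.listen(4000);
```

A first party like Facebook, by contrast, could skip the consent step whenever the user “intends to interact” with it, which is the loophole the op-ed objects to.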

This dubious distinction would harm competition in the online ad market by turning “Do Not Track” into “Do Not Track for small ad companies only.” Google, Facebook and other large companies that operate both first- and third-party businesses would be able to use data they gather through their first-party relationships to compete in the third-party ad market. Smaller ad tech companies would be at a severe competitive disadvantage and could even be driven out of the market.

The Federal Trade Commission shouldn’t help pick winners and losers through a murky process that has devolved into an effort to protect the positions of Internet giants. It should stay focused on policing the behavior of companies that give consumers short shrift or restrict competition. If the industry group recommends a lopsided version of “Do Not Track,” as expected, the commission should not go along with it. The correct balance between privacy and competition is a decision better left to Congress than to a feckless regulator.

Fred B. Campbell Jr. is executive director of the Center for Boundless Innovation in Technology and a former chief of the Federal Communications Commission’s Wireless Telecommunications Bureau.

Facebook’s Troubling One-Way Mirror

Jim Rutenberg  MAY 22, 2016


If you bothered to read the fine print when you created your Facebook account, you would have noticed just how much of yourself you were giving over to Mark Zuckerberg and his $340 billion social network.

In exchange for an admittedly magical level of connectivity, you were giving them your life as content — the right to run ads around video from your daughter’s basketball game; pictures from your off-the-chain birthday party; or an emotional note about your return to health after serious illness. You also gave them the right to use your information to help advertisers market to you, based on your likely state of pregnancy or new place among the consciously uncoupled.

There are privacy protections. Facebook says it will not share your identity with advertisers without your permission. And you can set limits on what they can know. But at the heart of the relationship is a level of trust and a waiving of privacy that Facebook requires from its users as it pursues its mission to “make the world more open and connected.”

But how open is Facebook willing to be in return? The way it initially handled this month’s flare-up over accusations of political bias in its Trending Topics feed can only lead to this answer: not very.

And that should concern anyone of any political persuasion as Facebook continues to gain influence over the national — and international — conversation.

That influence comes through the astounding growth in its users — 1.6 billion people and counting. Increasingly, those users are spending time on Facebook not only to share personal nuggets with friends, but, for more than 40 percent of American adults, according to Pew Research Center, to stay on top of news, which flows in and out of their individually tailored and constantly updating Facebook News Feeds.

That has helped chip away at the centrality of destination news sites like The New York Times, The Washington Post, the right-leaning Daily Caller and the left-leaning Talking Points Memo. Their articles must now vie for attention in the Facebook algorithms that help determine which items will be prominent in which users’ feeds and which will be highlighted in the Facebook Trending section that is prominent on users’ home pages.

So Facebook, born of the open Internet that knocked down the traditional barriers to information, becomes a gatekeeper itself. It now has an inordinate power to control a good part of the national discussion should it choose to do so, a role it shares with Silicon Valley competitors like Google and Twitter.

It’s a privileged position that Facebook won through its own ingenuity and popularity. Mr. Zuckerberg seemed to approach this new perch with a solemn sense of responsibility when he took the company public in 2012, swearing in an investor letter, “We believe that a more open world is a better world.”

And yet…

There we were earlier this month, with Facebook ensnared in one of those big public relations crises for which openness is always the best salve. The report in Gizmodo that Facebook had a team of editorial contractors who injected their own judgment into its computer-generated Trending list — and at times suppressed “news stories of interest to conservative readers” — ran without a response from Facebook, which ignored Gizmodo’s detailed questions.

Then came the slow and awkward response. There was the initial statement that Facebook could find “no evidence” supporting the allegations; Facebook said it did not “insert stories artificially” into the Trending list, and that it had “rigorous guidelines” to ensure neutrality. But when journalists like my colleague Farhad Manjoo asked for more details about editorial guidelines, the company declined to discuss them.

Only after The Guardian newspaper obtained an old copy of the Trending Topics guidelines did Facebook provide more information, and an up-to-date copy of them. (They showed that humans work with algorithms to shape the lists and introduce headlines on their own under some circumstances, contradicting Facebook’s initial statement, Recode noted.) It was openness by way of a bullet to the foot.

As his staff prepared answers to pointed questions from Senator John Thune of South Dakota, Mr. Zuckerberg took another step into the sunshine last week by holding a grievance session at Facebook’s campus with conservative commentators and media executives, including the Fox host Dana Perino, the Daily Caller editor Tucker Carlson and the Blaze founder and commentator Glenn Beck, who wrote a defense of Facebook afterward.

Many of Mr. Zuckerberg’s visitors seemed at least temporarily placated by his explanation: that Facebook had so far found no systemic attempt to excise conservative thought from the Trending list and that any such move would harm Facebook’s primary imperative (which is, in lay terms, to get every single person on earth to spend every waking moment on Facebook and monetize the living expletive out of it).

But a more important issue, one that had been lying beneath the surface for a while now, emerged during the meeting: the power of the algorithms that determine what goes into individual Facebook pages.

“What they have is a disproportionate amount of power, and that’s the real story,” Mr. Carlson told me. “It’s just concentrated in a way you’ve never seen before in media.”

What most people don’t realize is that not everything they like or share necessarily gets a prominent place in their friends’ newsfeeds: The Facebook algorithm sends it to those it determines will find it most engaging.
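
To make that concrete, here is a toy sketch of engagement-based distribution. It is not Facebook's actual ranking system, whose signals and weights are not public; it only shows the general shape of what is described above: each post gets a predicted-engagement score per friend, and only friends above a cutoff see it prominently.

```typescript
// Toy illustration only, with made-up signals and weights: a post surfaces
// in the feeds of friends whose predicted engagement clears a threshold.
interface FriendSignals {
  name: string;
  pastClicks: number; // how often this friend engaged with similar posts
  closeness: number;  // 0..1, strength of the relationship
}

function predictedEngagement(f: FriendSignals): number {
  // Hypothetical weights, chosen purely for illustration.
  return 0.7 * f.closeness + 0.3 * Math.min(f.pastClicks / 10, 1);
}

function audienceForPost(friends: FriendSignals[], cutoff = 0.5): string[] {
  return friends
    .filter((f) => predictedEngagement(f) >= cutoff)
    .map((f) => f.name);
}

// Only the friends the model scores highly ever see the post prominently.
console.log(
  audienceForPost([
    { name: "Ana", pastClicks: 12, closeness: 0.9 },
    { name: "Ben", pastClicks: 0, closeness: 0.2 },
  ])
); // -> ["Ana"]
```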

For outlets like The Daily Caller, The Huffington Post, The Washington Post or The New York Times — for whom Facebook’s audience is vital to growth — any algorithmic change can affect how many people see their journalism.

A cautionary tale came in 2014. The news site Upworthy was successfully surfing the Facebook formula with click bait headlines that won many eyeballs. Then a change in the Facebook algorithm punished click bait, which can tend to overpromise on what it links to. Steep traffic drops followed. (Upworthy has recovered, in part by relying more on video.)

Throughout the media, a regular guessing game takes place in which editors seek to divine how the Facebook formula may have changed, and what it might mean for them. Facebook will often give general guidance, such as announcing last month that it had adjusted its programming to favor news articles that readers engage with deeply — rather than shallow quick hits — or saying that it would give priority to live Facebook Live videos, which it is also paying media companies, including The New York Times, to experiment with.

This gives Facebook enormous influence over how newsrooms, almost universally eager for Facebook exposure, make decisions and money. Alan Rusbridger, a former editor of The Guardian, called this a “profound and alarming” development in a column in The New Statesman last week.

For all that sway, Facebook declines to talk in great detail about its algorithms, noting that it does not want to make it easy to game its system. That system, don’t forget, is devised to keep people on Facebook by giving them what they want, not necessarily what the politicos or news organizations may want them to see. There can be a mismatch in priorities.

But Facebook’s opacity can leave big slippery-slope questions to linger. For instance, if Facebook can tweak its algorithm to reduce click bait, then, “Can they put a campaign out of business?” asked John Cook, the executive editor of Gawker Media. (Gawker owns Gizmodo, the site that broke the Trending story.)

No Facebook executive would discuss it with me on the record. That’s not the only reason this column may seem a little cranky. My Facebook Trending list this week included this beaut: “Gastroesophageal Reflux Disease.” First of all, it was dyspepsia, and it was, like, 20 years ago. See that? I shared something pretty revealing. Your turn, Mr. Zuckerberg.


Facebook CEO Admits To Calling Users ‘Dumb Fucks’

Ryan Tate

September 20, 2010

Mark Zuckerberg admits in a New Yorker profile that he mocked early Facebook users for trusting him with their personal information. It was a youthful indiscretion, the Facebook founder says; he’s much more mature now, at the ripe age of 26.

“They trust me — dumb fucks,” says Zuckerberg in one of the instant messages, first published by former Valleywag Nicholas Carlson at Silicon Alley Insider, and now confirmed by Zuckerberg himself in Jose Antonio Vargas’s New Yorker piece. Zuckerberg now tells Vargas, “I think I’ve grown and learned a lot” since those instant messages.

And yet the old quote resounds precisely because Facebook continues to stir up privacy controversies at regular intervals. Zuckerberg justifies his privacy rollbacks by saying that social norms have changed in favor of transparency, but, as tech executive Anil Dash tells the New Yorker, that sort of change is much more appealing for a privileged Ivy League golden boy of Silicon Valley like Zuckerberg than for his half a billion users, many of whom work for less tolerant bosses and socialize in more judgmental circles.

 
