[A teachable moment in activism, and an important read for seeing all the swamp-monster machinations Silicon Valley puts us through. The post is extremely well written but takes some commitment to finish — it is highly recommended that you stick with it to the end of the story.]
The way Alastair Mactaggart usually tells the story of his awakening — the way he told it even before he became the most improbable, and perhaps the most important, privacy activist in America — begins with wine and pizza in the hills above Oakland, Calif. It was a few years ago, on a night Mactaggart and his wife had invited some friends over for dinner. One was a software engineer at Google, whose search and video sites are visited by over a billion people a month. As evening settled in, Mactaggart asked his friend, half-seriously, if he should be worried about everything Google knew about him. “I expected one of those answers you get from airline pilots about plane crashes,” Mactaggart recalled recently. “You know — ‘Oh, there’s nothing to worry about.’ ” Instead, his friend told him there was plenty to worry about. If people really knew what we had on them, the Google engineer said, they would flip out….
Facebook and Google were following people around the rest of the internet…using an elaborate and invisible network of browsing bugs — they had, within little more than a decade, created a private surveillance apparatus of extraordinary reach and sophistication. Mactaggart thought that something ought to be done. He began to wonder whether he should be the one to do it….
Almost by accident, though, Mactaggart had thrust himself into the greatest resource grab of the 21st century. To Silicon Valley, personal information had become a kind of limitless natural deposit, formed in the digital ether by ordinary people as they browsed, used apps and messaged their friends. Like the oil barons before them, they had collected and refined that resource to build some of the most valuable companies in the world, including Facebook and Google, an emerging duopoly that today controls more than half of the worldwide market in online advertising. But the entire business model — what the philosopher and business theorist Shoshana Zuboff calls “surveillance capitalism” — rests on untrammeled access to your personal data. The tech industry didn’t want to give up its powers of surveillance. It wanted to entrench them. And as Mactaggart would soon learn, Silicon Valley almost always got what it wanted.
Anyone concerned with the anticompetitive state of digital advertising, and how to fix it, should focus like a laser on the circumstances surrounding the FTC’s 2014 pass on formally investigating whether the Facebook-WhatsApp acquisition would “substantially lessen competition” under the Clayton Antitrust Act.
That FTC mistake, obvious in hindsight, triggered a winner-take-all domino effect that not only tipped Facebook into a social advertising monopoly, but also tipped the overall digital advertising market toward the anticompetitive digital advertising cartel that evidently predominates today.
As Facebook sought to become the world’s dominant social media service, it struck agreements allowing phone and other device makers access to vast amounts of its users’ personal information.
Facebook has reached data-sharing partnerships with at least 60 device makers — including Apple, Amazon, BlackBerry, Microsoft and Samsung — over the last decade, starting before Facebook apps were widely available on smartphones, company officials said. The deals allowed Facebook to expand its reach and let device makers offer customers popular features of the social network, such as messaging, “like” buttons and address books.
But the partnerships, whose scope has not previously been reported, raise concerns about the company’s privacy protections and compliance with a 2011 consent decree with the Federal Trade Commission.
They dangle fines of 4 per cent of annual turnover as the maximum possible penalty – €3.7bn, €1.3bn, €1.3bn, and €1.3bn, respectively – though regulators have stressed they won’t be handing out the top-level fines willy-nilly.
If you have been to any media conference or read any trade magazine over the past several years, you would know that ‘big data’ was going to be the answer to all of the woes of the music industry.
The idea was simple: collecting large amounts of data and profiling the people generating it (citizens) could show you how to extract more value from your customers.
Essentially, big data was a solution pitched and sold to the music industry as a panacea to fan engagement problems….While big data seems very attractive, using personal data and profiling fans may in fact turn out to be, like oil and plastics, already outdated and toxic.
A federal judge in California has ruled that Facebook can be sued in a class-action lawsuit brought by users in Illinois who say the social network improperly used facial recognition technology on their uploaded photographs.
The plaintiffs are three Illinois Facebook users who sued under a state law that says a private entity such as Facebook can’t collect and store a person’s biometric facial information without their written consent. The law, known as the Biometric Information Privacy Act, also says that information that uniquely identifies an individual is, in essence, their property. The law prohibits a private entity from selling, leasing, trading or otherwise profiting from a person’s biometric information.
U.S. District Judge James Donato ruled that the lawsuit can proceed as a class action representing potentially millions of Facebook users in Illinois. The judge is based in San Francisco where the case had been moved at Facebook’s request.
Last month, CreativeFuture asked you, our followers, what you thought about platform responsibility. Little did we know that, in the meantime, the issue would start taking over the front pages of our newspapers and websites!
In a nutshell, the issue is whether Google, Facebook, and their Silicon Valley peers should take responsibility for the ways their platforms are used to violate our laws and harm society.
Even before the House and Senate passed landmark legislation to demand accountability from the tech giants and even before Facebook’s Cambridge Analytica mess exploded, we asked your views on a few simple questions that came down to one thing: do you believe that Google and Facebook should be more responsible?
The answer, overwhelmingly, was that you do – and you had a lot to add in response. Here are just some of your comments:
“The organizations who own these platforms make enormous profits. They have a responsibility to make sure the platforms are not being used to harm others.”
“They have the greatest ability to do so. And a moral responsibility. Just because it’s a newer technology doesn’t exempt them.”
“Because if they are able to control it, and I believe that they can, then they should be held accountable and responsible if they don’t.”
“They are providing the service that is being used for these malicious acts. They are responsible! They need to find a solution and be held accountable!”
“Violations of the law should be prosecuted. To avoid prosecution, they should take proactive steps to prevent violations.”
“They created these platforms, they should be responsible for them. They are beyond wealthy from them and can afford to police them. U.S. laws should apply everywhere in the U.S., including [the internet]!”
“Times change, services change, service providers change. Rules must keep up with changes.”
“Hostile foreign governments are using internet social platforms to publish untrue propaganda in order to destabilize our nation … if they can’t or won’t [monitor their platforms], they should be heavily fined and shut down. It is their responsibility for doing business in this country.”
“Responsibility is part of having a business.”
“[Google and Facebook] are no different from any other corporation which has the responsibility not to enable breaking the law. They are complicit and just [as] guilty as those breaking the law.”
“I can’t believe we even have to ask this question. I am sick and tired of corporations bearing no responsibility for the effects of their services on people. If a crime is occurring and the corporation looks the other way, that cannot be allowed any longer.”
“They don’t want the responsibility of accountability because complying would eat into profits with no returns. So, it will NEVER happen unless it is legislated.”
“The internet has become perhaps the single most important source of information and communication in the world. It cannot just rake in profits and not be responsible for what they have created.”
This week, on April 10 and 11, Facebook’s CEO Mark Zuckerberg will testify twice before Congress on the issues facing his company, and Silicon Valley generally. We expect that Zuckerberg will be very well prepped by his army of lawyers. We anticipate that he will try to reassure Congress that Facebook is doing all it can to (1) protect the privacy of its users; (2) prevent foreign influence on its advertising networks; and (3) stop rampant violations of the law from being carried out on its platform.
But Congress should not settle for head-pats and platitudes. They need to ask some hard and direct questions. We hope they will include the following…