Before it recedes further into the past, I want to get back to the testimony of Mark Zuckerberg, founder and head of Facebook, before two Senate committees. The central issue was consumer choice. But that issue entirely missed the problem that precipitated the hearing – Cambridge Analytica's use of data to manipulate voter preferences in the 2016 election.
Members of the Senate Judiciary and Commerce Committees questioned Zuckerberg for hours about the privacy of consumer choices. Privacy could solve the problem if the law or our Facebook choices prevented entities like Cambridge Analytica from getting the information they needed to manipulate our preferences. But that won’t happen. Several senators asked Zuckerberg about allowing users to control their information with European-style opt-in rules. Consumers can opt out of various services now, and Zuckerberg accepted the value of opt-in rules. In many circumstances, Facebook already requires something like opt-in consent.
None of the committee members got at the underlying problem of manipulation of user preferences by the apps unleashed on Facebook. Those apps were based on extensive testing of users who responded to a program seeking information. Cambridge Analytica then generalized its findings from that program to other Facebook users about whom it had detailed information, and used what it had learned to redirect their preferences. It is tempting to think that the fundamental problem was the violation of standards of user privacy. Unfortunately, user control doesn’t solve the problem, either in theory or in practice.
It falls short in theory because no one has the right to sell his or her vote. Society works hard to prevent people from selling votes because vote-selling fouls up our political system. Delegates to the Constitutional Convention spoke about owner control of worker votes in English factories. Anything that would allow one or a few to control the votes of others destroys meaningful democratic choice, and does so on behalf of those who may have bad intentions for the people and their democratic system. The secret ballot was standardized over a century ago to make it harder to pay people for their votes or for voters to extract a fee. So voter/user control does not solve the problem in theory.
Internet businesses that collect data and use it to manipulate us present the same problem. People don’t have the right to consent to their own political manipulation any more than they have the right to sell their votes, and for the same reason. That should be the starting point of discussion, not consent. Consent doesn’t sanitize corrupt, anti-social transactions.
Voter control doesn’t solve the problem in practice because few voters seem sufficiently interested in keeping their data private to cut off Cambridge Analytica-style manipulation. The complexity of Facebook’s privacy settings compounds the problem.
As a result, most discussion about privacy doesn’t go nearly far enough.
Committee members did hit paydirt with questions about users who should have been banished, shut out, or stopped from misusing the system, like users promoting violence or users targeting the Rohingya in Myanmar. These are serious issues, but the risk of foreign manipulation was largely undefined and unaddressed.
The hard question is what manipulation is and how it can be regulated consistent with the First Amendment. Those are also serious questions and the answers aren’t obvious. But privacy is a distraction.
Stephen Gottlieb is Jay & Ruth Caplan Distinguished Professor Emeritus at Albany Law School. A widely recognized constitutional scholar, he has served on the New York Civil Liberties Union board and the New York Advisory Committee to the U.S. Civil Rights Commission. His latest book is Unfit for Democracy: The Roberts Court and The Breakdown of American Politics.
The views expressed by commentators are solely those of the authors. They do not necessarily reflect the views of this station or its management.