I Read A Book: Frank Pasquale’s The Black Box Society: The Secret Algorithms that Control Money and Information

by Raizel Liebler

What rights do people have to accuracy in their publicly available information? What limits should there be for who has access to our data souls?

Frank Pasquale’s The Black Box Society: The Secret Algorithms that Control Money and Information (Harvard University Press 2015) is an important book, not only for those interested in privacy and data, but also for anyone with larger concerns about the growing tension between transparency and trade secrets, and about the deceptiveness of pulling information from ostensibly objective “Big Data.”

Starting from issues involving digital reputations, Pasquale writes about how the “black boxes” of personal information that companies maintain can be morphed, queried, and molded in ways that negatively impact both individuals and groups. Presently, under U.S. law, people don’t own their own data – the data brokers do – and this allows for both inaccuracies and more surveillance. But database-gathered secret information “is valuable only if it is exclusive, and it remains exclusive only if the full power of the state can be brought to bear on anyone who discloses it without authorization” (215).

But The Black Box Society also takes on the underlying inequities that supposedly neutral coding hard-bakes into so many systems, including financial systems and transactions. While that is far from my area of expertise, these sections are hard-hitting – and will be of interest to readers concerned with the commodification and legalization of inequality. Pasquale writes about how these hidden algorithms aren’t objective at all – instead, “subtle but persistent racism” or other biases “may have influenced the past” and the initial settings of seemingly objective algorithms – and those past biases now get passed along through “present [] models as neutral, objective, nonracial indicia.” (41)

The Black Box Society also touches on other issues of inequality – such as the ways that sharing some people’s information has more of an impact on them than on others. (For more about the gendered aspects of online reputation and the dispersal of information, I strongly suggest Danielle Citron’s Hate Crimes in Cyberspace – and the recent accompanying Boston University Law Review symposium.)

One of the most important aspects of The Black Box Society builds on the work of Siva Vaidhyanathan and others to examine how relying on the algorithms of search impacts people’s lives. Because we cannot see how Google, Facebook, Twitter, and other companies decide what to display, their displays seem in some way “objective.” But they are not. Between the various stories about blocking pictures of breastfeeding moms, blocking links to competing sites, obscuring sources, and not creating tools to prevent harassment, companies are making choices. As Pasquale puts it: “at what point does a platform have to start taking responsibility for what its algorithms do, and how their results are used? These new technologies affect not only how we are understood, but also how we understand. Shouldn’t we know when they’re working for us, against us, or for unseen interests with undisclosed motives?”

Pasquale also argues that obscuring information makes it easier for those in positions of power to focus not on what would do good, but on what appears to benefit the powerful – while specifically avoiding the creation of data systems that would benefit those they claim to want to support. In regard to the recently defeated SOPA legislation, Pasquale asks: “What does it say about our Congress that it is readier to turbo-charge a police state, largely in the service of content industry oligopolists, than it is to revise and expand a venerable licensing method to support struggling journalists, artists, and musicians?” (203) The present payment model for those who receive royalties and license fees is clearly broken, but data – and data engineers – could do much good if the focus were on actually paying those who create content.

Much of what Pasquale is concerned about could be changed easily. But doing so would require those who work with data to understand how their work could potentially be used – and to create greater degrees of privacy by obscuring information. After all, more precise granularity exposes private information in a creepy way. Additionally, systems are created by people, and even Pasquale writes about how his suggestions are dismissed at the Silicon Valley companies where he raises these issues. Making the changes suggested in The Black Box Society happen will take intense social pressure – and perhaps a Supreme Court that understands that sharing inaccurate information is a violation of an as-yet-undefined privacy right that is nevertheless in the Constitution.

Summary: An essential book for understanding the algorithms that control much of our lives – from financial institutions to health information. The full impact of this book will only come over years, as academics, policy makers, and others quote and implement The Black Box Society’s suggestions.
