“The Internet is broken.”
Walter Isaacson, LinkedIn
“Tim Wu: ‘The Internet is the classic story of the party that went sour’ ”
John Naughton, The Guardian
“The Internet has a dark side.
We need a plan for taming it”
World Economic Forum Annual Meeting 2017
“How the Internet Is Loosening Our Grip on the Truth”
Farhad Manjoo, The New York Times
“Today’s Internet is Optimized for Noise”
Andy Bromberg, CEO of Sidewire
“The Facebook Algorithm Is Watching You.
Here’s one way to confuse it.”
Adrienne Lafrance, The Atlantic
These eye-catching headlines are just a sample of those that have dominated the news recently. So the secret is out. The Internet can no longer be seen as a utopian place where everyone is equal and freedom is the status quo. Not everything that has to do with technology is simply a “technical” issue; “the technical is political.” There are always values and politics involved in the structure, design, and architecture of technology. Put differently, these issues are socio-technical, and they have a deep effect on what sorts of citizens we become.
A comprehensive overview of the issues that will shape the Internet’s future is provided by the Winter 2016 issue of Dædalus, the Journal of the American Academy of Arts & Sciences, which is devoted to the Internet and curated by Harvard Professor Yochai Benkler and MIT computer scientist and early architect of the Internet David Clark. One of the key themes that emerges from the issue is that just as the original design choices engineers faced in the early days of the Internet shaped what the Internet became, i.e. a “‘general purpose’ network, designed for a variety of uses,” so will ongoing and future design choices shape its future. Benkler further explains that the original design choices of the Internet favored “decentralization of power and freedom to act” at the expense of control, and thus maximized entrepreneurial activity and innovation. Market developments have introduced new points of control and will continue to do so; as such, future design choices will inevitably be subject to conflicts of interest between governments, corporate stakeholders, and Internet users. The resulting choices will ultimately reflect the power dynamics of the environment in which they are made.
For Benkler, power is neither good nor bad, but simply “the capacity of an entity to alter the behaviors, beliefs, outcomes, or configurations of some other entity.” What is crucial, however, is an effort to continuously identify points of control as they emerge and to devise mechanisms that maintain “degrees of freedom” in the network. We are now in the “Age of Big Data,” and the technologies and methods that fall under that catchphrase represent one of these “control points.” While a precise definition of the term “Big Data” may be elusive, and the uses, tools, and techniques associated with big data are wide-ranging, it is helpful to think of the term as reflecting “a paradigm [more] than a particular technology, method, or practice.” Viewed this way, “big data […] is a way of thinking about knowledge through data and a framework for supporting decision making, rationalizing action, and guiding practice.” As such, Big Data may ultimately allow very few actors to “predict, shape, and ‘nudge’ the behaviors of hundreds of millions of people.” These actors are the few entities large enough to access, control, collect, and analyze vast amounts of data.
In today’s information environment, machine-learning algorithms that conduct predictive analytics based on some type of data mining are used in just about every context. They have infiltrated areas such as employment, education, criminal justice, medicine, insurance, retail, media, and culture. At a very broad level such algorithms “learn” from the past by analyzing it and taking into account what they deem as statistically significant to produce predictions of the future. They are trained to ignore outliers and assume that what has been will be, oftentimes inheriting or creating biases in the process.
While our society is not yet at the point where all of our experiences happen in controlled online environments or while we are “connected,” the instances and the complexity with which Big Data technologies are involved in our lives are increasing at an unprecedented pace. The speed of technological breakthroughs we are currently experiencing has no historical precedent; for some, we are at the early stages of a “Fourth Industrial Revolution” that is characterized by a convergence of the digital, physical, and biological spheres, is evolving at an exponential rather than a linear pace, and “will fundamentally alter the way we live, work, and relate to one another.”
Our online experiences are not a simple series of one-to-one relationships with each service we use, but increasingly are more integrated. “Big Data collection and processing, combined with ubiquitous sensing and connectivity, create extremely powerful insights on mass populations, available to relatively few entities.” Scholars such as Zeynep Tufekci explain that when “these methods [are] combine[d] with widespread experimentation […], behavioral science that analyzes individuals in a stimulus-response framework and increasingly on-the-fly personalization of platforms, platform companies can nudge users to form beliefs and preferences, follow behaviors, and increase the probability of outcomes with ever-finer precision.”
Viewed against this background, today’s “design choices” will fundamentally shape the form and structure of our society. The focus of this paper is on the design choices present in the current media environment and, more specifically, the social media environment that has created a new type of “platform press.” The argument advanced is that, as platforms have now become significant distributors of news, the design choices that have been made in this context can threaten the viability of a functioning marketplace of ideas as well as the possibility of true choice about receiving valued information.
A recent illustration of these concerns is the issue of fake news in the 2016 presidential election, which brought to the surface a broad debate about whether such platforms are in fact media companies, what kind of responsibilities they should bear, the role of Section 230 of the Communications Decency Act, and the correct policy approach.
Part I of the paper will provide background information on the current social media environment; the algorithmic filtering that takes place in the curation of news, and the problems that arise from the way it is set up; and will conclude with a call for regulation in the space. Part II will proceed to examine potential objections to regulation in this area, and argue that such objections are not irrefutable. Using examples of regulations that have been introduced in somewhat analogous circumstances in the past, Part II will conclude that some types of regulation are constitutionally permissible and can further promote social and constitutional values.
Sofia Grafanaki, Drowning In Big Data: Abundance Of Choice, Scarcity Of Attention And The Personalization Trap, A Case For Regulation, 24 Rich. J.L. & Tech. 1 (2017)