Legislation Can’t Fix Social Media
It’s up to users to vote with their feet if they are unhappy.
It is perhaps apocryphal, but when reportedly commenting on Russia’s peasant soldiers deserting the tsar’s army during the Russian Revolution of 1905, Vladimir Ilyich Lenin—the founder and first head of government of Soviet Russia—quipped, “They voted with their feet.”
In 1956, economist Charles Tiebout argued in his essay “A Pure Theory of Local Expenditures” that a similar kind of “foot voting” (a term he did not use), in which consumers willfully relocate to jurisdictions whose policies better align with their preferences, is more efficient and effective than voting to change a government or its policies. Ronald Reagan, agreeing with Tiebout, often said Americans should “move along” if their local government was not to their liking.
This is of course consistent with our diverse federal system of government so eloquently summarized by U.S. Supreme Court Justice Louis Brandeis, who said, “a single courageous State may, if its citizens choose, serve as a laboratory; and try novel social and economic experiments without risk to the rest of the country.” Legal scholar Ilya Somin says these laboratories of democracy and the migratory options they enable are, therefore, “a tool for enhancing political freedom: the ability of the people to choose the political regime under which they wish to live.”
My point in highlighting this context is to bring into sharp relief this truth: Americans have agency and the opportunity to improve their lives more efficiently and effectively than government can.
This truth, however, is no longer operative in the minds of many citizens. Its light is seemingly fading under gathering clouds of political anxiety, frustration, and desires for retribution—as clearly demonstrated by a new bipartisan bill in the House of Representatives.
The Filter Bubble Transparency Act—offered by Reps. Ken Buck (R-Colorado), David Cicilline (D-Rhode Island), Lori Trahan (D-Massachusetts), and Burgess Owens (R-Utah)—would force companies with more than 500 employees, more than $50 million in annual revenue, and more than 1 million annual users to offer a version of their services that does not employ “opaque algorithms.”
“Consumers should have the option,” says Rep. Buck, “to engage with internet platforms without being manipulated by secret algorithms driven by user-specific data.”
The bill’s text is a hot mess of legalese whose implementation would undoubtedly be every bit as opaque as the algorithms it intends to manage. What is clear, however, is that the bill will not work even if it could be coherently implemented.
The core conviction of this bill is helpfully explained by one of its co-sponsors, Rep. Cicilline:
“Facebook and other dominant platforms manipulate their users through opaque algorithms that prioritize growth and profit over everything else. And due to these platforms’ monopoly power and dominance, users are stuck with few alternatives to this exploitative business model, whether it is in their social media feed, on paid advertisements, or in their search results.”
What Cicilline fails to realize is that the company he uses as an example, Facebook, already provides a version of its platform that does not employ user-informed algorithms to curate content. On this version of Facebook, content is posted to a user’s feed chronologically, without any “user-specific data” raising or lowering its prominence.
And guess what? Users hate that version, and Facebook actually makes more money from it—the exact opposite of the outcomes the bill’s authors intend.
According to papers leaked by whistleblower Frances Haugen, the social media company conducted an experiment in 2018 in which it effectively turned off its News Feed ranking algorithm for 0.05 percent of users. Its findings showed that engagement dropped precipitously and that users hid 50 percent more posts (meaning they found those posts irrelevant or uninteresting). Use of Facebook Groups—where some of the most extreme and concerning content resides—skyrocketed. And Facebook actually made more money on advertising, because users had to scroll longer to find the content they were looking for and were therefore exposed to more ads.
Put simply: The prescription offered by the Filter Bubble Transparency Act has already been tried and would actually worsen the bill’s animating concerns, not make them better.
It is tempting to dismiss self-defeating attempts like this one as another example of Congress’ ineptitude and total lack of understanding of even the most basic aspects of internet platforms. And while this critique is more than justified, bills like these are responding to a real dynamic that should be better understood.
Algorithms are defined by their objective function—the outcome they are intended to maximize or minimize. User-specific data is fed into these algorithms to enable them to more effectively realize their objective functions. In the case of Amazon, your previous purchases, browsing habits, and social and demographic information are used to provide you with tailored recommendations that anticipate your interests and improve the likelihood of a purchase; this is the algorithm’s objective function. The objective function of Twitter’s algorithm, likewise, is to serve up content that keeps you on the platform longer and makes you more likely to engage with other Twitter users.
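To make the idea concrete, here is a toy sketch of an engagement-maximizing feed ranker. Everything in it (the field names, the weights, the scoring terms) is an illustrative assumption for exposition, not any real platform’s system:

```python
def engagement_score(post, user):
    """Objective function: estimate how likely this user is to engage with this post."""
    score = 0.0
    # User-specific data raises or lowers each post's prominence.
    if post["topic"] in user["interests"]:
        score += 2.0                            # matches the user's known interests
    score += 0.5 * post["prior_likes"] / 100    # already-popular posts rank higher
    if post["author"] in user["follows"]:
        score += 1.0                            # posts from followed accounts rank higher
    return score

def rank_feed(posts, user):
    """Order the feed to maximize the objective function, not chronology."""
    return sorted(posts, key=lambda p: engagement_score(p, user), reverse=True)
```

A chronological feed, by contrast, is what you get when you delete `engagement_score` entirely and sort by timestamp alone, which is exactly the alternative the bill would mandate platforms offer.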
Importantly, it is not just Big Tech that uses algorithms like these that are informed by user data. Fox News, MSNBC, CNN, the Washington Post, the New York Times, One America News, and virtually every other media organization purchases and uses data that allow them to deploy algorithms with powerful objective functions aimed at maximizing user engagement.
The problem is that normal market dynamics and weaknesses in the underlying technology incentivize objective functions that are too narrow and that give rise to the concerns that are bubbling up in society and in Congress.
A profit-based company in a free-market economy will always be incentivized to deploy algorithms with narrow objective functions aimed at growing user purchases and engagement. On the one hand, this means we can find and purchase goods that we need or want with more variety, lower cost, and less friction than at any other time in human history. On the other hand, it also means that our culture’s appetite for rage, sex, and violence increasingly defines our news and entertainment.
These algorithms are not manipulating us. They’re giving us exactly what we want. But this does of course reinforce our worst tendencies and accelerate social fracturing and political tribalization.
Some technologists are trying to build algorithms with more nuanced objective functions that not only surface desirable content and improve profits, but that also account for users’ “well-being.” But building complex objective functions that account for “human thriving,” “time well spent,” and “happiness” is very difficult and is hindered by the inherent reality that such algorithms will, at least in the short-term, cause companies to make less money.
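As a sketch of what a more nuanced objective might look like, consider blending a predicted engagement signal with a predicted well-being signal. Both signals and the blending weight are hypothetical; in practice, estimating something like “time well spent” is itself a hard modeling problem:

```python
def nuanced_score(post, wellbeing_weight=0.5):
    """Blend engagement with well-being. Raising wellbeing_weight trades
    short-term engagement (and, plausibly, short-term revenue) for user well-being."""
    engagement = post["predicted_engagement"]  # e.g., an estimated click/like probability
    wellbeing = post["predicted_wellbeing"]    # e.g., an estimated "time well spent" rating
    return (1 - wellbeing_weight) * engagement + wellbeing_weight * wellbeing
```

With the weight at zero, this collapses back to the narrow engagement objective; as the weight rises, content that scores high on engagement but low on well-being falls down the ranking.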
This is why, some will respond, government regulation is necessary—because the market will not choose this course on its own. But, as we have already seen, the government’s prescriptions are not only ineffective; they actually pour gas on the fires we are trying to extinguish.
So what can we do?
A clear market signal is more powerful than a government regulation. If a user truly believes they are being manipulated or abused by an internet platform, then they should leave that platform. If you want to be served by algorithms with more nuanced objective functions, then demonstrate your rejection of the status quo by refusing to use platforms whose objective functions are overly narrow.
If one is unwilling to do this, then perhaps the grievance is more forcefully argued than actually felt.
Some will contend that this would be abandoning the field and ceding these platforms to political opponents. I am less convinced this is true, but even if it is, Thomas Sowell’s maxim holds: “There are no solutions, only tradeoffs.”
And what would one get by walking away from social media? A growing body of studies demonstrates that those who make this break report “a significantly higher level of life satisfaction,” greater “subjective well-being,” and even better sleep.
The fear of missing out is itself a product of efforts to convince us that not being on these platforms is somehow decisively damaging to us. That is simply not true. And if we actually want these platforms to be better, we should aggressively exercise that wonderful power that a communist can only observe, but that an American can boldly assert—the power to choose the “regime under which they wish to live.”
At the end of it all, the Filter Bubble Transparency Act is a misguided response to a real challenge. But it is a challenge that we can address ourselves, without clumsy and coercive government action.
All we have to do is vote with our feet.