If Facebook opened its algorithms to regulators, would it make a difference?
Short answer: Yes, according to a top computer science professor. But the move alone would be insufficient.
The big news this week is that Facebook is planning to change its company name in the coming days. Alex Heath of The Verge, who broke the story, reported on Tuesday that the move reflects CEO Mark Zuckerberg’s focus on building the metaverse — a term coined by sci-fi novelist Neal Stephenson to describe “a virtual world people escape to” from the real world — and would likely position the flagship Facebook app “as one of many products under a parent company overseeing groups like Instagram, WhatsApp, Oculus and more.” Casey Newton at Platformer confirmed Heath’s scoop with his own sources and added that “the news could come as late as Oculus Connect, next Thursday; or as early as Facebook’s earnings call, coming this Monday.”
Despite the buzziness of these revelations, my attention has remained fixed on comments Nick Clegg, Facebook’s vice president of global affairs, made a couple of weeks ago when he indicated the company would be open to increased oversight of its algorithms. Facebook is engulfed in another cycle of intensifying criticism that the company prioritizes profit over the well-being of people who use its family of products. And for some reason, it thinks Clegg, whom I’ve been deeply critical of in previous columns, is persuasive enough to convince skeptics otherwise. The algorithms “should be held to account, if necessary by regulation so that people can match what our systems say they’re supposed to do from what actually happens,” Clegg said two Sundays ago on CNN’s State of the Union.
When I heard Clegg’s comments, I immediately wondered if regulators would be able to learn anything meaningful if Facebook did open its algorithms. I was also curious what an expert in the space — with obviously more proficiency than most members of Congress, even those who have jurisdiction over these issues — would be interested in knowing about the company’s processes if they had access to them.
So I reached out to Jason Hong, a professor at the Human-Computer Interaction Institute in Carnegie Mellon University’s School of Computer Science, to get his take. Hong believes Facebook should open up its algorithms, but that the move alone would be insufficient. “If regulators want to stem the spread of things like hate speech, disinformation and depression, they also need to take a close look at the processes Facebook uses to develop products and the metrics they measure.”
As Hong explained to me, part of the issue is that the core of Facebook’s NewsFeed algorithm is probably not inspectable. “This is just my best guess: The NewsFeed algorithm is some kind of giant N-dimensional machine-learning model that our poor primitive brains have no chance to really understand.” Facebook did not respond to multiple requests for comment from Supercreator News.
Researchers and lawmakers could understand elements like the number of people who can see posts from an advertiser. But according to Hong, the core ranking and selection formulas consider factors such as the recency of a post, the number of links it contains, the number of likes from people similar to you, and the emotional valence of the words in the post — and none of those factors is dubious when considered in isolation.
But as we know from what was disclosed by Frances Haugen, the latest Facebook whistleblower, and has been discovered by other academics, the NewsFeed algorithm seems to prioritize posts that are likely to have high engagement. “The problem is that posts that get people angry tend to have really high engagement and spread quickly, and that includes posts about politics, especially misinformation and disinformation,” Hong said. “All of this also doesn’t take into account other issues with Instagram, namely cyber-bullying, or the number of likes one gets being used to validate oneself.”
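Facebook’s real model is opaque — and, as Hong guesses, probably too large for anyone to interpret directly — but the dynamic he describes can be illustrated with a toy score. Everything below is a purely hypothetical sketch I wrote for this column, not Facebook’s actual formula: each input (recency, links, likes from similar people, emotional valence) is harmless on its own, yet once the score rewards predicted engagement, a post that provokes anger can outrank a calmer one on emotional intensity alone.

```python
from dataclasses import dataclass

@dataclass
class Post:
    recency_hours: float     # how long ago the post was made
    num_links: int           # links contained in the post
    likes_from_similar: int  # likes from people similar to the viewer
    valence: float           # emotional tone, -1.0 (angry) to +1.0 (joyful)

def rank_score(post: Post) -> float:
    """Toy engagement score. The weights are invented for illustration."""
    recency = 1.0 / (1.0 + post.recency_hours)   # newer posts score higher
    social = post.likes_from_similar * 0.5       # social proof
    links = post.num_links * 0.1                 # slight boost for links
    # Strong emotion in EITHER direction predicts engagement, so an
    # angry post (valence near -1) scores as high as a joyful one (+1).
    arousal = abs(post.valence) * 2.0
    return recency + social + links + arousal

# Two posts identical in every respect except emotional tone.
calm = Post(recency_hours=1, num_links=1, likes_from_similar=4, valence=0.1)
angry = Post(recency_hours=1, num_links=1, likes_from_similar=4, valence=-0.9)

# The angrier post wins purely on emotional intensity.
assert rank_score(angry) > rank_score(calm)
```

The point of the sketch is Hong’s: no single term is objectionable, but optimizing the sum for engagement quietly privileges whatever provokes the strongest reaction.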
So, given all this, Hong condensed his considerations into five questions for laypeople like you and me.