Social apps and media organizations need new organizing principles
The response from Facebook and Twitter to The New York Post’s shady Hunter Biden story shows us the consequences of optimizing for speed and reach.
For the past week, the dominant news story at the intersection of politics, technology and media has been the response from tech companies to a dubious story by The New York Post that focused on Hunter Biden’s ties to Ukrainian energy company Burisma. The Post’s claims were based on alleged emails and photos from Biden’s personal laptop. Facebook reduced the reach of the story; Twitter banned linking to the story at all (before eventually reversing course). And in the days since, tech VIPs like Casey Newton and Ben Thompson have published definitive essays on the merits of Facebook and Twitter’s interventions. So I’ll avoid duffing up what’s already well-worn terrain. Instead, let’s take a step back and consider why tech companies and media organizations keep finding themselves on the ropes to begin with.
For starters, remember that the two organizing principles they’re optimized around are speed and reach. I report and write these essays under intense deadlines, but relatively speaking, I take my time. The hours (and sometimes days) I spend working on a post would be an unnerving luxury to most beat reporters at Big Media publications. And I understand the alluring payoff of quick turnarounds, since the spoils of the new economy go to the stories that trend fast and far. But there is little news of consequence so valuable that it must be immediately absorbed and shared. Granted, I’m an outlier, but I’ve lived nearly three years without Facebook and Instagram or iPhone and desktop notifications, and breaking news still finds me wherever I am in any given moment. Yet because the economics reward speed and reach, we’ve somehow learned to accept it as par for the digital course that peddlers of mis- and disinformation, the most infamous of whom is Donald Trump, can tweet or post some bullshit instantaneously and have it picked up by their fans and critics alike.
Even if people want to be more discerning, it’s almost impossible because all social posts look the same even though they serve different functions. Whether you share news, memes, misinformation or hate speech, it’s all decontextualized as a Tweet or News Feed update until it reaches enough people to be editorialized. It’s in this window between the time something is published and the time it’s recognized as problematic that bad-faith actors thrive. Meanwhile, journalists have responded with a 200 percent increase in the number of fact-checking organizations launched since Trump was elected in 2016. But as Nathan Walter, a disinformation researcher at Northwestern University, told Mathew Ingram of the Columbia Journalism Review a year ago, fact-checking works in the sense that “people’s beliefs become more accurate and factually consistent” after seeing a fact-checking message. And attempts to add relevant context, a popular recommendation from the crowd that believes the solution to bad speech is more speech, can actually make the problem worse, according to Walter’s research.
Few preemptive guardrails discourage the spread of misinformation, anyway. We all have access to the same apps with the same features that can influence the same communities. So tech companies find themselves on defense by default, and journalists find themselves jostling for attention alongside whatever else has been given the same weight as their professional output. Social apps are buoyed by billions of dollars’ worth of investments in content moderation. But attempts to moderate often make tech companies an easy target for the kind of scapegoating conservatives have become famous for. And it remains to be seen how effective Facebook’s Oversight Board will be: CEO Mark Zuckerberg designed it to serve as “almost like a Supreme Court [that] is made up of independent folks who make the judgment call on what should be acceptable speech in a community that reflects the social norms and values of people all around the world,” but when it starts hearing cases, it will only decide whether content should be allowed or removed from Facebook or Instagram. However, as I wrote in May, it’s the content that’s left up that often has the most adverse impact anyway.
There are no obvious solutions that would please everyone. But I think a few would be worth discussing at least and experimenting with at most. First, it would be nice if social feeds organized posts into categories or labeled them so users had context for what they’re about to see, beyond the account that posted the media, before reading, listening or watching it.
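Here’s a minimal sketch of what that kind of up-front labeling could look like; the Post type, the category names and the classify() heuristic are all hypothetical and don’t map to any platform’s actual API.

```python
# Hypothetical sketch: attach a content category to every post before it
# reaches the feed, so the label travels with the media itself.
# CATEGORIES, Post and classify() are invented for illustration.
from dataclasses import dataclass

CATEGORIES = ("news", "opinion", "meme", "personal", "promotion")

@dataclass
class Post:
    author: str
    text: str
    category: str = "uncategorized"

def classify(post: Post) -> Post:
    # A real system would rely on a trained model or author-supplied labels;
    # this placeholder just flags link-heavy posts as "news".
    post.category = "news" if "http" in post.text else "personal"
    return post

def render(post: Post) -> str:
    # Surface the label ahead of the content, independent of who posted it.
    return f"[{post.category.upper()}] @{post.author}: {post.text}"

print(render(classify(Post("example_user", "Read this: https://example.com/story"))))
```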
I’d also be a fan of social apps updating their feeds periodically by default, let’s say every fifteen minutes, instead of in real time. Safiya Umoja Noble, a professor at the University of California at Los Angeles, calls this “slow media” and told Annalee Newitz of The New York Times last year that it “would give human moderators or curators time to review content. They could quash dangerous conspiracy theories before they lead to harassment or worse. Or they could behave like old-fashioned newspaper editors, fact-checking content with the people posting it or making sure they have permission to post pictures of someone.” Users could pay for social apps to update their feed every one, two or 10 minutes as a premium upgrade. (Another suggestion from sci-fi writer John Scalzi via Newitz’s Times article: online profiles would begin with everything and everyone blocked by default, and news and entertainment would reach you only after you opted into them, which would protect users from “viral falsehoods, as well as mobs of strangers or bots attacking someone they disagree with.”)
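To make the idea concrete, here’s a rough sketch of a delayed feed under those assumptions: posts sit in a buffer and are only released once a fixed interval has passed, which is the review window Noble describes. The class name, intervals and tier names are all invented for illustration.

```python
# Hypothetical "slow media" feed: new posts are buffered rather than shown in
# real time, and only released when the refresh interval elapses, giving
# moderators a review window. Intervals and tier names are illustrative.
import time

DEFAULT_INTERVAL = 15 * 60                     # seconds; the free, default cadence
PREMIUM_INTERVALS = {"one": 60, "two": 2 * 60, "ten": 10 * 60}

class SlowFeed:
    def __init__(self, interval=DEFAULT_INTERVAL):
        self.interval = interval
        self.pending = []                      # posts awaiting release (and review)
        self.visible = []                      # posts already in the feed
        self.last_release = time.monotonic()

    def submit(self, post):
        # Nothing goes straight to the feed; it waits in the buffer first.
        self.pending.append(post)

    def refresh(self):
        # Release buffered posts only once the interval has passed.
        if time.monotonic() - self.last_release >= self.interval:
            self.visible.extend(self.pending)
            self.pending.clear()
            self.last_release = time.monotonic()
        return self.visible
```

In this sketch, a premium subscriber would simply be given SlowFeed(PREMIUM_INTERVALS["one"]) instead of the fifteen-minute default.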
And I’ve floated this idea before, but it’s worth mentioning again: I wonder if users would feel deterred from posting and spreading misinformation if doing so resulted in Twitter applying a warning label or badge to their profile in addition to the offending Tweet (maybe a red X designed like the blue verified checkmark?). What if Twitter profiles had an Uber-like rating to reward the most respectful and healthy users? I’m unsure of the technology required to realize these ideas, but at face value, they would hold users accountable for what they post.
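As a back-of-the-napkin sketch, profile-level accountability could be as simple as a strike counter that flips on a visible warning badge past a threshold, plus a rolling rating from other users. The threshold, field names and 1–5 scale below are assumptions for illustration, not anything Twitter has described.

```python
# Hypothetical profile-level accountability: repeated misinformation strikes
# earn a visible warning badge (the red-X analogue to the blue checkmark),
# and an Uber-style rolling rating rewards healthier behavior.
# WARNING_THRESHOLD and the 1-5 scale are invented for illustration.
from dataclasses import dataclass, field

WARNING_THRESHOLD = 3      # strikes before the profile itself is badged

@dataclass
class Profile:
    handle: str
    strikes: int = 0
    ratings: list = field(default_factory=list)

    def add_strike(self):
        # Called whenever one of the account's posts earns a warning label.
        self.strikes += 1

    @property
    def warning_badge(self):
        return self.strikes >= WARNING_THRESHOLD

    def rate(self, score):
        # Other users score interactions from 1 to 5, Uber-style.
        self.ratings.append(min(5.0, max(1.0, float(score))))

    @property
    def rating(self):
        return round(sum(self.ratings) / len(self.ratings), 2) if self.ratings else 5.0
```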
For perspective, it’s helpful to recognize that social apps aren’t the internet and the internet isn’t the filter by which most people interpret their daily lives or prioritize their politics. (A study from the Pew Research Center found that 10 percent of all Twitter users are responsible for 97 percent of all tweets about politics.) For the Very Online, social apps help us affirm our own world views and reassure ourselves that we’re not like those with whom we disagree. But most Americans, according to Yanna Krupnikov and John Barry Ryan at The New York Times, “just see two angry groups of people bickering over issues that may not always seem pressing or important.” Still, it’s probably worth it to tech companies and media organizations, as long as those petty squabbles are shared far, wide and as fast as possible.
FYI
The election is in exactly two weeks! If you’re registered to vote and will be doing so on Election Day, be sure to make a plan. (Also: Happy birthday, Kamala Harris — I hope I get to call you Madam Vice President soon!) And finally, I’ll be on Fall Break Oct 26–30 so instead of new posts, you’ll receive The Supercreator Rewind, a weeklong series of stories, conversations and essays from the archive that you may have missed (or won’t mind reading again!).
Read All About It
Brent Kendall and Rob Copeland at The Wall Street Journal on the Justice Department’s long-awaited antitrust lawsuit against Google:
The Justice Department alleged that Google, a unit of Alphabet Inc., is maintaining its status as gatekeeper to the internet through an unlawful web of exclusionary and interlocking business agreements that shut out competitors. The government alleged that Google uses billions of dollars collected from advertisements on its platform to pay for mobile-phone manufacturers, carriers and browsers, like Apple Inc.’s Safari, to maintain Google as their preset, default search engine.
Kevin Koeninger at Courthouse News Service on the grand juror on the Breonna Taylor case who spoke out against the Kentucky attorney general:
The statement made by the grand juror on Tuesday contradicts previous statements issued by the attorney general, who said the grand jury agreed with the charges his office handed down.
The juror said they felt “compelled” to act after their experience on the grand jury and Cameron’s press conference regarding the charges against Hankison.
“The grand jury did not have homicide offenses explained to them,” the statement said. “The grand jury never heard anything about those laws. Self-defense or justification was never explained either.”
The anonymous juror added, “Questions were asked about additional charges and the grand jury was told there would be none because the prosecutors didn’t feel they could make them stick.”
Steven Zeitchik at The Washington Post on how CBS is remaking its police shows for the Black Lives Matter era:
When Black Lives Matter protests swept the world in June, they provoked a strong response throughout the television industry. Shows such as A&E’s “Live PD” were shelved, and many news organizations questioned their own depictions of police and communities of color. At ViacomCBS, the reaction was swift on the Viacom side of the company: Paramount TV canceled the controversial reality show “Cops,” leading to its stranding overseas, while executives at MTV, with a long history of activism, led a charge across the firm’s cable networks to stop programming for eight minutes and 46 seconds, the time it was originally thought a police officer knelt on Floyd’s neck.
Ryan Mac and Craig Silverman at BuzzFeed News on Facebook CEO Mark Zuckerberg’s intention to return to business as usual after the election:
While observers have speculated that Facebook’s new policies against potentially violent and conspiratorial content could mean that it’s turned a corner — or that the company is preparing for a Biden presidency and possible government regulation — Zuckerberg’s comments are an indication that the new rules are only stopgap measures. The 3 billion people who use at least one Facebook-owned product should not expect more rule changes after the election, Zuckerberg said.
Margaret Sullivan at The Washington Post on what Kristen Welker can learn from Savannah Guthrie about dealing with Trump:
Welker, too, must employ some strength and show some independence from the format. She can’t fact-check everything in the moment — but she can and must keep the debate from being a superspreader of disinformation. She has a strong obligation to do so if, for example, Trump says, as he has before, that young people aren’t likely to suffer much from covid-19. Or that the country is “turning the corner” on controlling the virus when in fact it’s on the rise. She should avoid cloying efforts to keep control by soothing the president as Wallace did when he promised that Trump would like the next question, as if he were offering him a cookie if he’d just be good. And she should be the one to pose questions and direct the discussion, not ceding that to either candidate.
Sarah Perez at TechCrunch on Instagram’s new badges, which enable creators to monetize their Live experiences:
To kick off the roll out of badges, Instagram says it will also temporarily match creator earnings from badge purchases during live videos, starting in November. Creators @ronnebrown and @youngezee are among those who are testing badges. The company says it’s not taking a revenue share at launch, but as it expands its test of badges it will explore revenue share in the future.
Feedback
If you’re enjoying what you’re reading, please recommend The Supercreator to a friend. They can sign up here. If you want to share your thoughts on an item from this post or on The Supercreator in general, reply to a newsletter or email me at michael@thesupercreator.com.
Super Picks
Love Language: This London-based startup ships community care packages for everyone, made up of products, services and experiences by independent Black businesses.