Particularly for those familiar with a certain type of online personality, there has for some time been no real question about Elon Musk’s political ideology. Sure, he made his name building electric vehicles, earning the affection of the (wealthier part of the) environmental wing of the Democratic Party. But he was also demonstrably what’s called an “edgelord,” a guy who likes to stir up controversy and revel in nihilistic provocation. In practice, that generally meant embracing and elevating Roganistic (as in Joe) contrarianism and Greenwaldian (as in Glenn) anti-establishmentarianism.
Then he bought Twitter. He did so ostensibly because he was worried about the pervasiveness of the “woke mind virus,” his disparaging term for those elevating (or just recognizing) racial inequalities, particularly in American power structures. He also bought it, fairly obviously, because he wanted more attention and appreciation. So he acquired the platform (later renaming it X for inexplicable reasons) and dragged its spotlight over so that he sat at its center. It cost him $44 billion, but he remains X’s main character.
This means that when he casually replies to fringe-right voices (as he does often) or offers his approval for wildly antisemitic conspiracy theories (as he did Wednesday night), people notice. People react, often negatively.
Let’s leave Musk for a moment and consider one of his company’s competitors, TikTok.
TikTok, too, has an ownership structure that has attracted negative attention. The app is a subsidiary of ByteDance, a company based in China and linked to the Chinese government. For years, that has made the company a target of U.S. government criticism and scrutiny, from both sides of the aisle.
In the aftermath of the brutal Hamas attack in Israel, for example, TikTok came under fire for allegedly amplifying pro-Palestinian content, a shift that the app’s critics identify as downstream from Chinese national interests. More recently, an anti-American screed written by Osama bin Laden years ago has gotten a lot of play on TikTok, again raising questions about the app’s use as a vehicle for hostile propaganda.
But those ties to China also invite (at times opportunistic) overreaction. A Washington Post analysis, for example, found that other sites, including Meta-owned Instagram, saw a similar divergence in engagement between posts supportive of the Palestinian cause and those supportive of Israel. The Post’s Drew Harwell points to another likely trigger for the relative prominence of pro-Palestinian content: TikTok (and Instagram) skew young, and young people are more sympathetic to the Palestinian position, according to multiple polls, including one released Wednesday by Fox News.
This divergence in age is significant, as new data from the Pew Research Center demonstrates. U.S. adults are most likely to point to Facebook as a source from which they regularly get news, but TikTok has passed X. In other words, TikTok is now a more common source for social-media-based news than the app formerly known as Twitter.
You can see the age divide in Pew’s data: In the audience that uses TikTok for news, there are more than twice as many users under 30 as users 50 and older. That split is similar for Instagram, narrower for Twitter and flipped for Facebook.
TikTok users also are increasingly likely to use the app for news consumption. It is now about as common for users of TikTok to use it regularly for news as it is for users of Facebook to do so.
This overlaps with Musk’s “woke mind virus” nonsense: TikTok and Instagram have younger user bases holding more-liberal political views than older Americans do, so those views are attributed to the apps themselves. That’s particularly useful in the case of TikTok, with its Chinese ownership: Clearly, this is simply a function of dopey young people being fed dishonest garbage as they stare, unblinking, into their phones. It’s of a piece with the right’s broader “they must be brainwashed” excuse for young people holding more left-leaning values.
The bin Laden example shows how this is exploited. It is the case that a number of users are sharing weird, pro-bin Laden content on TikTok. It is also the case that TikTok’s success (as the journalist Ryan Broderick recently observed) lies not in boosting things to go viral for everyone but, instead, in drilling down into specific niches of interest. It’s unlikely that most TikTok users are encountering this rhetoric, but enough people are encountering it that it can be presented as a trend. Once it is presented as a trend, of course, it draws new, expanded attention.
Part of our tendency to overstate the scale of virality is a function of our inability to adjust our comprehension of social interactions for the internet age. In the same way that our evolution into modern humans left us far better able to differentiate between 10 and 20 than between 1 billion and 1 trillion — a difference that was never important before about a century ago — we have difficulty differentiating between a clamor coming from 100 people online and one coming from 100,000. It’s easy to pile up a few examples of something online that strike us as a cacophony, but it usually isn’t one.
This question of how social media tools shape platform usage is actually the important one, more important than the ownership question. Yes, it’s important that Musk constantly regurgitates misinformation and endorses overtly antisemitic rhetoric, particularly given his follower base on Twitter/X and his insistence that his presence on the app is robust. But it’s more important that he reshaped the app to downplay trusted sources of information and to reward — including monetarily — controversy. Musk not only amplifies false claims; he helped reorient X generally to encourage the spread of false, often politically slanted, information.
He frames this as an issue of “free speech,” which is sort of true in the abstract, in the sense that free speech means letting anyone holler anything they want, however stupid or dishonest. But he hasn’t just given a platform to that hollering; he has also made it easier for the hollering to drown out other things. That has encouraged more and louder hollering, and Musk clearly relishes it.
Part of his impetus for purchasing Twitter was that he accepted the argument that the social media site had unfairly limited speech in a way intended to muffle conservative voices. This was a particularly common line of argument in 2020, leveraging often anecdotal examples of Twitter’s tamping down on edge cases of election denial or coronavirus misinformation. (The Hunter Biden laptop thing certainly didn’t help.) But this was only the most recent iteration of hostility to social media companies that began after the 2016 election.
That election crystallized two aspects of social media use that were seen as unhealthy: the spread of misinformation (including, to an overstated degree, by Russian actors) and the increase in often politically loaded abuse. Apps like Twitter and Facebook created systems focused on tamping down those behaviors. This affected a lot of conservative voices (though not exclusively) and, in part because Twitter and Meta were based in hyper-blue California, some conservatives decided that the muting was politically targeted. There’s no evidence it was; in fact, evidence suggests that, thanks in part to complaints, right-wing voices are given more latitude.
You see how this mirrors the TikTok thing: Disagreement with the perceived politics of ownership leads to an assumption about how the application steers its users. Musk’s obvious politics and obvious manipulation of his platform earn shrugs from allies who find TikTok infuriating.
It’s fair to wonder which social media platform is more vulnerable to public pressure, TikTok or X. The former is eager to demonstrate its independence, given ongoing pressure from federal regulators and politicians. X, privately owned, does what it wants.
There’s a useful third company to throw into the mix here: Meta. It is publicly owned and faces less (but not zero) consternation about the politics of its leadership. It, like Twitter, implemented post-2016 controls aimed at limiting misinformation.
Recently, though, it scaled one of those back: Advertisers can now make claims about elections being stolen and rigged, something that was banned after the 2020 election. Pressure from the right appears to have outweighed fears of serving as a platform for misinformation — or, perhaps, the challenge of filtering out false information offered diminishing corporate returns. Either way, those elevating or exploiting false claims about 2020 benefit.
This is a manifestation of the outcome the right has sought ever since Meta and other companies implemented filters to root out misinformation: getting misinformation and propaganda back into the mix, with all of the accompanying political and economic benefits. Focusing on TikTok, because of its ownership, as the locus of this problem is intentionally missing the forest for the trees.