Banning TikTok Isn’t the Solution.

What’s that saying about trees and forests, again?

After a bit of a lag, the US Congress seems intent on moving forward with a proposed ban on TikTok. It appears the plan is to tuck it in with other controversial legislation, perhaps even this weekend. If you need catching up - the central idea is that Congress is concerned TikTok could be a threat to national security. The app’s parent company, ByteDance, is Chinese and therefore (like all Chinese companies) is subject to the whims and machinations of the Chinese government. Unless you’ve been living under a rock or were quite literally born yesterday, I think you know what that means. US lawmakers’ stated concern is that China could be spying on US TikTok users and weaponizing the platform with “propaganda.”

I am skeptical that the ban would be beneficial, and skeptical of the premise upon which the proposed legislation was drafted. There’s no real indication that TikTok/ByteDance’s “affiliation” with the Chinese government poses a meaningful threat. Sure, there is a vague whistleblower report alleging that the Chinese government has used a back door into TikTok for espionage. There has been a report or two suggesting that the “firewall” separating US user data (stored via Oracle) may not be as strong as TikTok claims. But on the whole, there’s not much evidence to suggest that the grand design of TikTok is to convert US users into unwitting sympathizers for pro-China political perspectives. Even US intelligence leadership has failed to produce evidence of Chinese malfeasance - every doomsday scenario is purely hypothetical at this point.

There is a ton of evidence, however, to indicate that the majority of US lawmakers lack a fundamental understanding of 1) the business model of social media, and 2) how TikTok works. I encourage you to watch as much as you can stomach of the recent Congressional hearings featuring leaders from TikTok, Snap, Meta and more. It’s a wicked cocktail of hilarious self-owning, shocking ineptitude, and exasperating grandstanding. Not that those elements make it markedly different from the average Congressional hearing, mind you. Would you trust these people to make sound decisions at the intersection of media and technology?

The problem with all of this hullabaloo over TikTok is that it’s a classic case of not seeing the forest for the trees. Geopolitical espionage is not the main threat social media poses. The real threat is the interest-graph model that TikTok has perfected, and its effect on everything from our mental health to economic competition. And it’s not just TikTok that’s a problem in this regard - it’s the entire social media ecosystem.

[GIF: how I feel when social media CEOs and Congress debate each other.]

When social media began to emerge between 2008 and 2011, most if not all of the apps’ algorithms were built on what’s known as a ‘social graph.’ That means the algorithm prioritized showing you content from accounts you were connected to or following. Over time, that began to shift toward what’s known as an ‘interest graph.’ The difference is in the details, but in short, an interest graph prioritizes surfacing content it predicts a user will like. It relies less on the connections a person has made by following others, and more on the user’s observed behavior. The algorithm weighs a host of variables, but put simply, it tries to show you more of the type of content you tend to consume. TikTok has nailed this. Its algorithm is amazingly (terrifyingly?) precise. Watch a few videos and/or tap to “heart” or comment on them, and buckle up - you’re about to be served a bajillion more just like that first batch.
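
To make the distinction concrete, here’s a minimal Python sketch of the two ranking philosophies. Everything in it - the Post and User shapes, the 0-to-1 engagement signal, the affinity weights - is an illustrative assumption on my part, not a description of how TikTok (or any platform) actually works:

```python
from dataclasses import dataclass

# Toy contrast between a "social graph" and an "interest graph" feed.
# Every field and weight here is an illustrative assumption,
# not anyone's actual ranking system.

@dataclass
class Post:
    author: str
    topic: str
    engagement: float  # normalized signal (watch time, likes...), 0 to 1

@dataclass
class User:
    follows: set          # accounts the user follows
    topic_affinity: dict  # topic -> preference learned from past behavior

def social_graph_rank(user: User, feed: list) -> list:
    # Social graph: only surface posts from accounts the user follows.
    return sorted(
        (p for p in feed if p.author in user.follows),
        key=lambda p: p.engagement,
        reverse=True,
    )

def interest_graph_rank(user: User, feed: list) -> list:
    # Interest graph: score *every* post by predicted interest,
    # regardless of whether the user follows the author.
    return sorted(
        feed,
        key=lambda p: user.topic_affinity.get(p.topic, 0.0) * p.engagement,
        reverse=True,
    )
```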

The algorithm sends users down rabbit holes with minimal guardrails. Just today I watched three or four videos about “healthy snack recipes” for a new business pitch, and I’ve subsequently been deluged with all sorts of permutations of healthy food, recipes, snacking, and related content. Now that’s a relatively benign example - but imagine if someone were to start watching and engaging with, say, pro-terrorist content. They’d instantly be accelerated into a vortex of similar videos, increasing the potential for misinformation and radicalization that could lead to very ugly real-world outcomes such as harassment, bigotry, or violence.
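
And here’s the rabbit-hole feedback loop, built on the toy ranker sketched above. Again, the numbers and account names are invented for illustration - the real systems are vastly more sophisticated, but the dynamic is the same: each watch nudges affinity upward, and higher affinity pins more of the same to the top of the feed.

```python
def watch_top_video(user: User, feed: list) -> str:
    # One session: the user watches whatever the interest graph ranks first,
    # which bumps their affinity for that topic - so the next ranking
    # skews even harder toward it. That's the rabbit hole.
    top = interest_graph_rank(user, feed)[0]
    user.topic_affinity[top.topic] = user.topic_affinity.get(top.topic, 0.0) + 0.2
    return top.topic

# A few "healthy snack" views for a work pitch is all it takes to lock in:
me = User(follows=set(), topic_affinity={"healthy snacks": 0.1})
feed = [
    Post("chef_a", "healthy snacks", 0.9),
    Post("anchor_b", "news", 0.8),
    Post("coach_c", "sports", 0.7),
]
for _ in range(4):
    print(watch_top_video(me, feed))  # "healthy snacks", four times in a row
```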

“Well, this sounds awful - we SHOULD ban TikTok!”, you might be saying to yourself. However, if TikTok were banned in the US tomorrow, users would simply flow over to Instagram, Snap, or X - all of which use similar interest graphs! Social media is a copycat game. The minute someone does something new and different, the other big players race to implement it themselves. This is how Meta effectively flattened Snapchat’s rise: in 2016, Instagram launched Stories, which was… a direct ripoff of Snapchat’s Stories. This is why banning an app isn’t a solution - it doesn’t deal with the root cause of social media sickness.

The entire business model of social media is predicated upon keeping users engaged - more usage means more ad dollars. It’s been said, “If you’re not paying for the product, you ARE the product.” Social media apps need to keep as many people as possible regularly engaged so that they can tout massive numbers to advertisers, and they use that engagement data to entice companies to spend their media budgets on their apps. Your data powers their profits. You can see where this is headed. Social media companies have no incentive to make editorial choices; they simply keep feeding you whatever you saw before to keep you locked in. It’s unfiltered media, akin to The Wall Street Journal letting anyone publish in the op-ed section.

However, there’s one big difference between an app like TikTok and a publisher like The Wall Street Journal. These companies are media publishers, but they are not held to the same standard as legacy media. They bear no responsibility for their algorithms’ net effect on their users. Section 230 of the 1996 Communications Decency Act shields these companies from liability for the content posted by users - the same users they monetize by algorithmically elevating certain content to increase advertising revenue. The platforms claim they can’t be held liable because it would be impossible for them to enforce moderation effectively and consistently - but that’s disingenuous, because these companies already perform moderation every day: they use an algorithm to elevate content and determine what each user will (or won’t) see.

The problem lawmakers should be focused on is the lack of responsibility social media companies bear relative to the immense power they wield in the market and over our society. The current setup stifles competition from new challengers (see: Snapchat vs. Instagram) and amplifies social pressures on the most impressionable users. There is a pretty strong correlation between increased social media usage and declining mental health among young people. It’s reached a point where the latest generations are actively trying to moderate their own usage.

[Chart: teen suicide rates have skyrocketed since 2007... gee, I wonder what happened then?]

If US lawmakers want to be productive, they should focus on enacting legislation that makes social media companies bear responsibility for the net effect of their algorithms. Congress should define these companies as what they are: media. It’s right there in the name - social media.

To be clear, I think there are obvious and significant benefits that social media has provided to society. We are more aware, knowledgeable, and engaged with the wider world because of real-time news and conversation within social networks. New economic opportunities have been created within these apps, most notably an entirely new creator economy. Moments and makers of pop culture that would’ve been ignored or faded quickly without much impact can now blow up and dominate the discourse, spurring fresh creative opportunities in their wake. I have spent nearly twenty years in digital marketing, and for most of that time I have either led or had direct involvement in social media strategy and management for big brands. I think it is a rich, valuable medium. I also think the industry as a whole is being given too much unwarranted leeway.

As a dear Uncle once said, “With great power comes great responsibility.” Right now, companies like ByteDance (and Meta, Google, Snap, and X…) hold massive amounts of power without shouldering much (if any) of the responsibility for how it’s wielded. That should change, and it would improve the ecosystem for everyone. Users would get a better experience, with more opportunities for discovery within the apps. Competition would increase, as the current big players would have to invest in risk management, allowing new and smaller platforms to win on experience curation. The entrenched companies would probably be forced to look for new revenue streams beyond advertising. One I’m quite fond of - and have written about previously - is the subscription model. This would be a net benefit to advertisers as we enter a new era of a “niche-fied” internet. The healthiest audience is one that’s opting in with its wallet.

I don’t hold out much hope that the current iteration of our government (or any in the near future) will wise up. Our representatives seem more focused on hypothetical problems (or even made-up ones like censorship; don’t get me started on Congressional reps crying “First Amendment!” about a private company enforcing its terms of service. Any elected leader who does so should be expelled from their position and forced to take a test on the US Constitution. Alas, I digress). I do hope that marketers like myself will continue to advocate for clear, common-sense solutions to help improve the digital media experience. The health of our economy, our culture, and our discourse depends upon it.