Measure #1: Emulate Best Practices

Comparing the political ad strategies of US tech companies reveals many broad similarities and specific differences in approach. At a high level, the following is a list of best practices that TikTok should emulate:

We have also seen that the exact implementation of these practices can vary widely. On the whole, it is clear that Facebook's system is the most sophisticated and comprehensive, followed by Google, Twitter (a distant third), and Snapchat (an even more distant fourth). In the abstract, TikTok should obviously emulate Facebook, rather than, say, Snapchat, in terms of the sophistication of its ad policy.

However, Facebook's system costs significantly more. That investment is justified by the intense negative media attention Facebook received for its alleged role in the 2016 elections, and it is made possible by Facebook's much larger revenues (at least compared to Twitter and Snapchat).

As Mark Zuckerberg noted in the leaked audio of Facebook's internal Q&A session:

It’s why Twitter can’t do as good of a job as we can. I mean, they face, qualitatively, the same types of issues. But they can’t put in the investment. Our investment on safety is bigger than the whole revenue of their company.

As such, how much TikTok should invest in these best practices is similarly a function of how much TikTok expects to grow and how much scrutiny it expects to receive. I am not in a position to judge how much of the experience and infrastructure devoted to compliance within China can be transposed to this international context.

From a strategic standpoint, many of these investments in ad/content policy are fixed costs that can be spread across the user base and even across different products. Large companies with scale can thus afford to invest more and, in this sense, growth and investments in safety are mutually reinforcing.

Measure #2: Internal Alignment and External Transparency

Any political ad policy will inevitably have many gray areas and tend toward complexity. (This is true of ad policy in general, and of content moderation as well.) First, such a policy must be tailored to the needs of specific countries/territories and comply with local laws. As such, internal inconsistency within a given country/territory, and cross-sectional inconsistency across different countries/territories, should be expected. Second, many of these norms are evolving, so we can expect temporal inconsistency as well. This complexity can only be managed, not eliminated.

In principle, internal alignment (across different teams at ByteDance/TikTok) and external transparency (with advertisers, users, and potentially regulators) could mitigate this complexity.

Internal Alignment

Disclaimer: I am writing this section without much specific knowledge of how TikTok is currently organized. As such, much of the following may already be in practice or, on the contrary, be unfeasible in light of existing arrangements.

Internally, TikTok should try to combat complexity with simplicity and transparency in terms of: