How aspiring influencers are forced to fight the algorithm


There are two ways to understand the impact of content moderation and the algorithms that enforce it: by relying on what the platform says, and by asking creators themselves. In Tyler's case, TikTok apologized and blamed an automatic filter that was set up to flag words related to hate speech but was apparently unable to understand context.

Brooke Erin Duffy, an associate professor at Cornell University, collaborated with graduate student Colten Meisner to interview 30 creators on TikTok, Instagram, Twitch, YouTube and Twitter around the time Tyler's video went viral. They wanted to know how creators, especially those from marginalized groups, navigate the algorithms and moderation practices of the platforms they use.

What they found: Creators invest a great deal of work in understanding the algorithms that shape their experiences and relationships on these platforms. Because many creators use multiple platforms, they have to learn the hidden rules of each one. Some change their entire approach to producing and promoting content in response to the algorithmic and moderation biases they encounter.

Below is our conversation with Duffy about her forthcoming research (edited and condensed for clarity).

Creators have long discussed how algorithms and moderation affect their visibility on the platforms that made them famous. So what surprised you the most while doing these interviews?

We knew that creators' experiences are shaped by their understanding of the algorithm, but after the interviews we really started to see how profound [this impact] is in their daily lives and work… the amount of time, energy and attention they put into learning about these algorithms. They have a critical awareness that these algorithms are uneven. Despite this, they still invest all this energy in the hope of understanding them. It really draws attention to the one-sided nature of the creator economy.

How often do creators think about the possibility of being censored, or of their content not reaching their audience because of algorithmic suppression or moderation practices?

I think it fundamentally structures both their content creation process and their content promotion process. These algorithms change at will; there is no transparency. In many cases there is no direct communication from the platforms. And this has a fundamental impact not only on your experience but also on your income.