I assume getting around TikTok and Meta algorithms that would automatically filter this image from public view.
Gotta make it safe and corpo friendly for the advertisers! 🙄
It probably just gets tagged as “spicy meme someone tried to censor” and still gets tracked and targeted the same way as the rest of the content. They know full well what kind of post it is; a line is not going to break the algorithm.
Imagine being a child and seeing the word violence or ass. I would kill myself
You would what yourself? Up 'til now you’d have been shielded from that scary word!
imagine it was really fine ass, and you never saw ass that fine again
It has nothing to do with what children see; the reasoning is that advertisers don’t want their products associated with violence by having their ad shown next to a post about violence.
Also platforms don’t want to be seen as places of free expression by the current administration. State department approved messaging only!
I would literally shake