TikTok is rolling out new features to combat online bullying and harassment
Mar 10, 2021 14:18 EST

Social networking platforms have been cracking down on cyberbullying with tools that automatically detect offensive comments. Instagram, for example, launched an artificial intelligence-powered feature in 2019 designed to flag abusive comments and notify users before they are posted. TikTok announced two similar capabilities today that it says are aimed at promoting "kindness" on the platform.

One of these features warns you before you post an "inappropriate or unkind" comment. It comes in the form of a pop-up prompt asking you to reconsider and edit your comment before posting it. The same prompt appears if TikTok detects "words that may violate" its community guidelines. However, this does not outright block offensive comments, since you can still choose to post them anyway.