Content Moderation Case Study: Decentralized Social Media Platform Mastodon Deals With An Influx Of Gab Users (2019)


Wed, Mar 3rd 2021 3:41pm — Copia Institute
Summary: Formed as a more decentralized alternative to Twitter that allows users to more directly moderate the content they want to see, Mastodon has experienced slow but steady growth since its inception in 2016.
Unlike other social media networks, Mastodon is built on open-source software and each "instance" (server node) of the network is operated by users. These separate "instances" can be connected with others via Mastodon's interlinked "fediverse." Or they can remain independent, creating a completely siloed version of Mastodon that has no connection with the service's larger "fediverse."
This puts a lot of power in the hands of the individuals who operate each instance: they can set their own rules, moderate content directly, and prevent anything they and their users find undesirable from appearing on their servers. But the larger "fediverse" -- with its combined user base -- poses moderation problems that can't be handled as easily as those that arise on independent "instances." Because the connected "fediverse" lets instances interact with each other, unwanted content can appear on servers that are trying to steer clear of it.
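To make that instance-level control concrete, the sketch below shows one way an instance operator's domain policy might work conceptually. This is a hypothetical illustration, not Mastodon's actual implementation: the DomainPolicy class, the severity names, and the should_accept function are all invented for the example, though the "silence"/"suspend" distinction roughly mirrors the kinds of graduated responses fediverse admins describe (hide remote content from public timelines versus cut off a remote domain entirely).

```python
from dataclasses import dataclass
from enum import Enum
from urllib.parse import urlparse


class Severity(Enum):
    """Illustrative severity levels an admin might apply to a remote domain."""
    NOOP = "noop"        # federate normally
    SILENCE = "silence"  # accept content, but keep it off public timelines
    SUSPEND = "suspend"  # reject content from this domain entirely


@dataclass
class DomainPolicy:
    """Hypothetical per-instance moderation policy keyed by remote domain."""
    blocks: dict[str, Severity]

    def severity_for(self, actor_url: str) -> Severity:
        domain = urlparse(actor_url).hostname or ""
        # Match the remote domain or any of its subdomains against the blocklist.
        for blocked, severity in self.blocks.items():
            if domain == blocked or domain.endswith("." + blocked):
                return severity
        return Severity.NOOP


def should_accept(policy: DomainPolicy, actor_url: str) -> bool:
    """Return True if incoming content from this remote account should be stored at all."""
    return policy.severity_for(actor_url) is not Severity.SUSPEND


# Example: an instance that defederates from one domain and silences another.
policy = DomainPolicy(blocks={
    "unwanted.example": Severity.SUSPEND,
    "noisy.example": Severity.SILENCE,
})

print(should_accept(policy, "https://unwanted.example/users/someone"))  # False
print(should_accept(policy, "https://social.example/users/someone"))    # True
```

The point of the sketch is that each instance applies its own policy at the point where federated content arrives, which is why two instances on the same "fediverse" can see very different versions of the network.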

Related Keywords

Google, Safety Foundation, Twitter, Techdirt Team, Copia Institute, Content Moderation Case Study, Mastodon Server, Eugen Rochko, Play Store, Content Moderation, Decentralized, Federated Social Media, Instances, Gab, Mastodon
