Content Moderation Case Study: Friendster Battles Fakesters (2003)

Fri, Apr 16th 2021 3:38pm — Copia Institute
Summary: While the social networking space today is dominated by Facebook, it is worth looking at how Facebook's predecessors handled content moderation challenges. One of the earliest social networks to reach mainstream recognition was Friendster, founded by Jonathan Abrams in 2002 and launched in early 2003; millions of users signed up to connect with friends. Originally built as a dating site, it quickly expanded beyond that.

One of the first big content moderation questions the site faced was whether to allow "fakesters." As the site grew rapidly, one popular practice was setting up fake accounts: profiles for entirely fictional characters (e.g., Homer Simpson), concepts (e.g., Pure Evil), random objects (e.g., Giant Squid), or places (e.g., New Jersey). Researcher danah boyd catalogued the different types of fakesters and studied the phenomenon of fake accounts on the site.

