ToxMod uses AI to bleep out abusive language in live voice chat

Using machine learning, two MIT graduates have developed a toolset that could stop toxic online behaviour before it starts.

Fighting toxicity in multiplayer games remains a puzzle for developers: Ubisoft has spent years trying to reduce player toxicity in Rainbow Six Siege, Valve has tried muting and banning toxic Dota 2 players, and EA has stripped thousands of assets out of its games in the hope of promoting a more positive environment. Now, a Boston-based software developer has launched a tool that might help make meaningful progress.