Intel's New Anti-Toxicity Voice Chat AI
Intel is developing new anti-toxicity software called "Bleep," aimed at gamers, to filter out harmful language before it reaches the user's ears. The software transcribes incoming voice chat, then detects categories of toxic speech, including misogyny, name-calling, and racial hate speech. Each filter can be toggled on and off, and a slider adjusts how much of each category of toxicity is redacted from voice chat.
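Intel has not published Bleep's internals, but the described flow (transcript in, per-category toggles, a sensitivity slider) can be sketched roughly as follows. The category names, word list, and severity scores below are invented purely for illustration:

```python
# Hypothetical sketch of a category-based toxicity filter with
# per-category on/off toggles and a sensitivity slider.
# The lexicon and severity scores are made up for illustration;
# this is not Intel's actual implementation.

TOXIC_LEXICON = {
    "name_calling": {"idiot": 0.3, "loser": 0.2},  # word -> severity (0..1)
}

def filter_transcript(transcript: str, enabled: dict, slider: float) -> str:
    """Redact flagged words from a speech transcript.

    enabled: maps category name -> bool (the per-category toggles)
    slider:  0.0 (redact almost nothing) .. 1.0 (redact everything flagged)
    """
    out = []
    for word in transcript.split():
        severity = 0.0
        for category, terms in TOXIC_LEXICON.items():
            if enabled.get(category) and word.lower() in terms:
                severity = max(severity, terms[word.lower()])
        # A higher slider setting catches lower-severity words.
        if severity > 0 and severity >= 1.0 - slider:
            out.append("*" * len(word))  # redact in place
        else:
            out.append(word)
    return " ".join(out)
```

With the slider at maximum, any flagged word is redacted; lowering it lets milder words through, and disabling a category's toggle leaves its words untouched.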
Many games already include some form of profanity filter that can be toggled, but these filters are usually applied only to text chat, leaving voice chat completely unmonitored. According to the Anti-Defamation League, around 22% of players will quit a game because of harassment. The developers of "Bleep" believe the software will keep a player's in-game environment cleaner, especially for younger gamers.