Friday, December 18, 2020

New tech allows AI to detect toxicity in voice chat, but I think humans might be too smart for it

Toxicity in games is no fun, and in this year of our lord 2020, there seems to be a growing trend of using artificial intelligence to find and deal with toxic players. I don’t just mean in text chat either; the companies Modulate and FaceIt have both created AI that can supposedly detect toxicity in voice chat from the way that someone says something.

Part of me feels like this is a good idea. Having a way of quickly and easily getting rid of toxic players is great. However, I’ve heard one too many stories about AI learning to be racist, so I do wonder if it’s the best sort of tech to put in video games.




from Rock, Paper, Shotgun https://ift.tt/3p2xIF0
via ifttt
