The technology, called Perspective, will review comments and score them based on how similar they are to comments people said were “toxic” or likely to make them leave a conversation.
Perspective examined hundreds of thousands of comments that had been labelled as offensive by human reviewers to learn how to spot potentially abusive language.
Perspective will not decide what to do with comments it flags as potentially abusive; rather, publishers will be able to surface them to their moderators, or develop tools that help commenters understand the impact of what they are writing.
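The idea of scoring a comment against a corpus of human-labelled examples can be sketched in a few lines of Python. This is only an illustration of the concept, not Perspective's actual model (which uses machine learning trained on far larger data); the function names and the tiny training set here are hypothetical.

```python
from collections import Counter

def train_toxicity_scorer(labelled_comments):
    """Count word frequencies separately for toxic and non-toxic comments."""
    toxic_words = Counter()
    clean_words = Counter()
    for text, is_toxic in labelled_comments:
        words = text.lower().split()
        (toxic_words if is_toxic else clean_words).update(words)
    return toxic_words, clean_words

def toxicity_score(comment, toxic_words, clean_words):
    """Score 0..1: fraction of words seen more often in toxic examples."""
    words = comment.lower().split()
    if not words:
        return 0.0
    hits = sum(1 for w in words if toxic_words[w] > clean_words[w])
    return hits / len(words)

# Hypothetical labelled data, standing in for human-reviewed comments.
training = [
    ("you are an idiot and a loser", True),
    ("what a stupid idiot", True),
    ("thanks for the thoughtful reply", False),
    ("great point, I agree with you", False),
]
toxic, clean = train_toxicity_scorer(training)
print(toxicity_score("you idiot", toxic, clean))
print(toxicity_score("thanks for the reply", toxic, clean))
```

A real system would use a trained classifier rather than raw word counts, but the workflow matches the one the article describes: learn from comments humans marked offensive, then emit a score that a moderator or a commenter-facing tool can act on.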
If you liked this post, please share it.
Call us at 98256 18292
Mail us at tccicoaching.com