A recent blog post by Sheeraz Raza reported that Facebook is taking a more active role in identifying extremism. The social media platform has apparently developed an algorithm to flag people who have viewed "extremist" content. Flagged users receive a warning message, and their friends receive one as well:
Users on Twitter and elsewhere are not happy with this, as it implies that Facebook is judging what is or isn't extremist content (and who is or isn't an "extremist"). There is also the question of what the social media platform may be doing with this information — is it being passed along to law enforcement?
You be the judge.