Do Social Networks Have an Obligation to Remove Hateful Content?
Americans, and the world at large, have watched in shock as group after group has rioted and killed over a video posted to YouTube.
Every social network and user-content community must confront the same basic question: at what point does freedom of speech no longer apply? When a user or group of users decides to push the boundaries of social norms, at what point should moderators step in? Do you agree or disagree with Google's decision?
By Mike Allton, Content Marketing Practitioner
Mike is a Content Marketing Practitioner - a title he invented to represent his holistic approach to content marketing that leverages blogging, social media, email marketing and SEO to drive traffic, generate leads, and convert those leads into sales. He is an award-winning Blogger, Speaker, and Author at The Social Media Hat, and Brand Evangelist at Agorapulse (formerly CMO at SiteSell).
As Brand Evangelist, Mike works directly with other social media educators, influencers, agencies and brands to explore and develop profitable relationships with Agorapulse. Follow @Mike_Allton