Do Social Networks Have an Obligation to Remove Hateful Content?
Americans and the world at large have watched, stunned, as more and more groups have rioted and killed over a video posted to YouTube.
Every social network and user-content community must face the same basic question: at what point does freedom of speech no longer apply? When a user or group of users decides to push the boundaries of social norms, at what point should moderators step in? Do you agree or disagree with Google's decision?
DISCLOSURE: Many of the links in the article above, and throughout this site, are affiliate links. While there's no additional cost to you, any purchases made via those links may earn me a commission. Rest assured, only products and services which have been rigorously tried and tested are reviewed, and those reviews are always thorough and honest. If you benefited from my review and have a genuine interest in the linked product, your use of the affiliate link is appreciated and allows me to continue writing these kinds of helpful articles. Current examples include Agorapulse, Tailwind, Wishpond or SEMrush. Please also note that I am employed by SiteSell as their Chief Marketing Officer and am fully authorized to share product and company information from extensive personal experience.
By Mike Allton, Content Marketing Practitioner
Mike is a Content Marketing Practitioner - a title he invented to represent his holistic approach to content marketing that leverages blogging, social media, email marketing and SEO to drive traffic, generate leads, and convert those leads into sales. He is an award-winning Blogger, Speaker, and Author at The Social Media Hat, and Brand Evangelist at Agorapulse (formerly CMO at SiteSell).
As Brand Evangelist, Mike works directly with other social media educators, influencers, agencies and brands to explore and develop profitable relationships with Agorapulse. Follow @Mike_Allton