On Internet websites that invite users to post comments, content moderation is the process of detecting contributions that are irrelevant, obscene, illegal, harmful, or insulting, as opposed to useful or informative ones; the same process can also be used to censor or suppress opposing viewpoints. The purpose of content moderation is to remove problematic content, apply a warning label to it, or allow users to block and filter it themselves.[1]
Various types of Internet sites permit user-generated content such as comments, including Internet forums, blogs, and news sites powered by scripts such as phpBB, wiki software, or PHP-Nuke. Depending on the site's content and intended audience, its administrators decide what kinds of user comments are appropriate, then delegate the responsibility of sifting through comments to junior moderators. Most often, moderators attempt to eliminate trolling, spamming, or flaming, although this varies widely from site to site.
Major platforms use a combination of algorithmic tools, user reporting and human review.[1] Social media sites may also employ content moderators to manually inspect or remove content flagged for hate speech or other objectionable content. Other content issues include revenge porn, graphic content, child abuse material and propaganda.[1] Some websites must also make their content hospitable to advertisements.[1]
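The combination of automated filtering, user reports, and human review described above can be illustrated with a minimal sketch. All names, terms, and thresholds below are hypothetical and chosen only for illustration; no real platform's system is this simple.

```python
# Hypothetical sketch of a moderation pipeline: an automated keyword
# filter removes matches outright, while accumulated user reports
# escalate a post to a human review queue.

BANNED_TERMS = {"spamword", "slurword"}  # placeholder banned-word list
REPORT_THRESHOLD = 3                     # reports needed before escalation

def moderate(comment: str, report_count: int) -> str:
    """Return an action: 'remove', 'review', or 'allow'."""
    words = {w.strip(".,!?").lower() for w in comment.split()}
    if words & BANNED_TERMS:
        return "remove"   # automated filter catches the post outright
    if report_count >= REPORT_THRESHOLD:
        return "review"   # enough user reports: queue for human review
    return "allow"

print(moderate("buy spamword now", 0))  # remove
print(moderate("ordinary comment", 5))  # review
print(moderate("ordinary comment", 0))  # allow
```

In practice the automated stage is usually a machine-learning classifier rather than a word list, and the review queue feeds trained human moderators, but the tiered structure is the same.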
In the United States, content moderation is governed by Section 230 of the Communications Decency Act, and several cases concerning the issue have reached the United States Supreme Court, such as Moody v. NetChoice, LLC.
[1] Grygiel, Jennifer; Brown, Nina (June 2019). "Are social media companies motivated to be good corporate citizens? Examination of the connection between corporate social responsibility and social media safety". Telecommunications Policy. 43 (5): 2, 3. doi:10.1016/j.telpol.2018.12.003. S2CID 158295433. Retrieved 25 May 2022.