At RedTube, nothing is more important than the safety of our community. Our core values of inclusivity, freedom of expression, and privacy are only possible when our platform is trusted by its users. That is why we have always been committed to eliminating illegal content, including non-consensual material and child sexual abuse material (CSAM). Every online platform has a moral responsibility to join this fight, which requires collective action and constant vigilance.
Over the years, we have put in place robust measures to protect our platform from non-consensual content. We are constantly improving our trust and safety policy to better flag, remove, review and report illegal material. While leading non-profit and advocacy groups recognize that our efforts have been effective, we know there is more to do.
In April 2020, we retained outside experts to independently review our compliance program and make recommendations that focus on eliminating all illegal content and achieving a “best-in-class” program that sets the standard for the technology industry.
Today, we are taking major steps to further protect our community. Going forward, we will only allow properly identified users to upload content. We have banned downloads. Earlier this year, we also partnered with the National Center for Missing & Exploited Children, and next year we will issue our first transparency report. Full details on our expanded policies can be found below.
If you wish to report any content that violates our terms of service, including CSAM or other illegal content, please click this link.
1. Verified Uploaders Only
Effective immediately, only content partners and people within the Model Program will be able to upload content to RedTube. In the new year, we will implement a verification process so that any user who successfully completes our identification protocol can upload content.
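To make the gating concrete, here is a minimal sketch of how an upload gate keyed to verification status might work. Everything in it (the `User` type, the role names, and the helper functions) is hypothetical and stands in for RedTube's internal systems.

```python
# Minimal sketch of an upload gate keyed to uploader verification status.
# All names here are illustrative, not RedTube's actual implementation.

from dataclasses import dataclass

# Roles currently allowed to upload; a generic "verified" role could be
# added once the planned identification protocol ships.
VERIFIED_ROLES = {"content_partner", "model_program"}

@dataclass
class User:
    user_id: str
    role: str  # e.g. "content_partner", "model_program", "viewer"

def can_upload(user: User) -> bool:
    """Return True only for users whose identity has been verified."""
    return user.role in VERIFIED_ROLES

def handle_upload(user: User, video_bytes: bytes) -> str:
    if not can_upload(user):
        return "rejected: uploader is not verified"
    # Verified uploads would still pass through moderation before publishing.
    return "accepted: queued for moderation review"
```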
2. Banning Downloads
Effective immediately, we have removed the ability for users to download content from RedTube, with the exception of paid downloads within the verified Model Program. In tandem with our fingerprinting technology, this will help prevent content that has already been removed from the platform from returning.
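The following sketch shows, under simplifying assumptions, how a fingerprint blocklist can keep removed content from coming back. The hashing scheme is a placeholder: production systems use robust perceptual video fingerprints that survive re-encoding, cropping, and trimming, whereas a plain cryptographic hash only catches byte-identical copies.

```python
# Sketch of blocking re-uploads of removed content via a fingerprint
# blocklist. SHA-256 is a stand-in for a real perceptual fingerprint.

import hashlib

# Fingerprints of every video that has been removed from the platform.
removed_fingerprints: set[str] = set()

def fingerprint(video_bytes: bytes) -> str:
    # Placeholder: only matches byte-identical files, unlike a true
    # perceptual fingerprint.
    return hashlib.sha256(video_bytes).hexdigest()

def record_removal(video_bytes: bytes) -> None:
    """Called when content is taken down, so it can never silently return."""
    removed_fingerprints.add(fingerprint(video_bytes))

def is_blocked_reupload(video_bytes: bytes) -> bool:
    """Check a new upload against the blocklist before it goes live."""
    return fingerprint(video_bytes) in removed_fingerprints
```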
3. Expanded Moderation
We have worked to create comprehensive measures that help protect our community from illegal content. In recent months we deployed an additional layer of moderation: the newly established “Red Team,” dedicated solely to self-auditing the platform for potentially illegal material. The Red Team provides an extra layer of protection on top of the existing protocol, proactively sweeping already-uploaded content for potential violations and identifying any breakdowns in the moderation process that could allow a piece of content that violates the Terms of Service to slip through.

Additionally, while the list of banned keywords on RedTube is already extensive, we will continue to identify and ban additional keywords on an ongoing basis. We will also regularly monitor search terms within the platform for increases in phrasings that attempt to bypass the safeguards in place (a sketch of this kind of monitoring follows the technology list below).

RedTube’s current content moderation includes an extensive team of human moderators dedicated to manually reviewing every single upload; a thorough system for flagging, reviewing and removing illegal material; robust parental controls; and a variety of automated detection technologies. These technologies include:
- CSAI Match, YouTube’s proprietary technology for combating Child Sexual Abuse Imagery online
- PhotoDNA, Microsoft’s technology that aids in finding and removing known images of child exploitation
- Vobile, fingerprinting software that scans new uploads for potential matches to unauthorized material, protecting against banned videos being re-uploaded to the platform
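As referenced above, here is a sketch of the keyword safeguards: blocking banned search phrases outright, and surfacing spikes in novel phrasings that may be attempts to evade the blocklist. The banned list, counts, and threshold are illustrative assumptions, not RedTube's actual configuration.

```python
# Sketch of keyword-based safeguards: block banned phrases and flag
# frequent unfamiliar phrasings for human review. All values illustrative.

from collections import Counter

BANNED_KEYWORDS = {"example_banned_term"}  # stand-in for the real, extensive list

def is_blocked_query(query: str) -> bool:
    """Reject any search containing a banned keyword."""
    return any(term in BANNED_KEYWORDS for term in query.lower().split())

# Tally of permitted search phrases over a review window, so moderators
# can spot sudden increases in phrasings that try to bypass the blocklist.
search_counts: Counter = Counter()

def log_query(query: str) -> None:
    if not is_blocked_query(query):
        search_counts[query.lower()] += 1

def review_candidates(min_count: int = 100) -> list[str]:
    """Return phrases frequent enough to warrant human review."""
    return [q for q, n in search_counts.items() if n >= min_count]
```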
If a user encounters a piece of content they think may violate the Terms of Service, we encourage them to immediately flag the video or fill out the Content Removal Request Form, which is linked on every page.
Our policy is to immediately disable any content reported through the Content Removal Request Form while it is reviewed.
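The disable-first policy can be summarized as a small state machine, sketched below. The states and function names are hypothetical; the point is the ordering: reported content goes down immediately and stays down until a human review completes.

```python
# Sketch of the report-handling policy: disable first, review second.
# States and names are illustrative, not RedTube's internal systems.

from enum import Enum

class VideoState(Enum):
    LIVE = "live"
    DISABLED_PENDING_REVIEW = "disabled_pending_review"
    REMOVED = "removed"

video_states: dict[str, VideoState] = {}

def report_via_removal_form(video_id: str) -> None:
    # Content is not viewable while moderators assess the report.
    video_states[video_id] = VideoState.DISABLED_PENDING_REVIEW

def complete_review(video_id: str, violates_terms: bool) -> None:
    # Reinstate only if the review finds no Terms of Service violation.
    video_states[video_id] = (
        VideoState.REMOVED if violates_terms else VideoState.LIVE
    )
```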
4. NCMEC Partnership
We voluntarily partnered with the National Center for Missing & Exploited Children (NCMEC) in order to transparently report and limit incidents of CSAM on our platform.
5. Independent Review
As part of our commitment, in April 2020 we hired the law firm Kaplan Hecker & Fink LLP to conduct an independent review of our content compliance function, with a focus on meeting legal standards and eliminating all non-consensual content, CSAM, and any other content uploaded without the meaningful consent of all parties. We asked that the review identify the steps required to achieve a “best-in-class” content compliance program that sets the standard for the technology industry. Kaplan Hecker & Fink LLP is continuing its review, but based on its evaluation of our current policies and practices it has already identified and categorized a comprehensive inventory of remedial recommendations, supported by dozens of additional sub-recommendations, beyond the steps identified above. The firm is also soliciting information to assist with its review and with developing recommendations regarding our compliance policies and procedures.
RT