Hello,
Let me preface this with a few things before we get into it. I am a YouTuber. I am posting this anonymously because I feel the information should get out, as it could give creators some insight, but I am also concerned about the potential repercussions that sharing it could bring.
I was contacted some time ago by a person who was a viewer of my work. This person (henceforth known as “my contact”) worked for a company that ended up being connected to Google/YouTube. I will not provide too many details about this person in order to protect them, but they became my “eyes and ears” behind the scenes on what was going on with YouTube’s new policies and policy enforcement (things YouTube has refused to be upfront or direct about). In the midst of “codes” and creators trying to figure out how to tell if their content is at risk, I felt the need to release this with my contact’s blessing.
Some months ago, Google put out a job posting seeking a number of people to work on one project. Google claimed in the posting that they needed a third-party company to work as “web search evaluators.” My contact worked for the unnamed company that put in for the job and was accepted by Google. My contact didn’t know what to expect from the job, but was told to sign a Non-Disclosure Agreement in order to be a part of it. My contact accepted and signed the NDA. That is when things changed.
Once the NDA was signed, my contact found that they weren’t working as a “web search evaluator.” They were demonetizing YouTube videos. They would be given YouTube videos to review and had a checklist of sorts to go through to be sure the video fit (or didn’t fit) certain criteria. You can see screenshots from my contact’s end in this post via Imgur.
One of the most important takeaways is that if the person reviewing a video wouldn’t feel comfortable watching it in public, it should automatically be demonetized. My contact stated that the company told them that if they were on the fence about a video and didn’t really know whether it violated any of YouTube’s new “rules,” they should demonetize it anyway. Also, if the reviewer doesn’t find anything on the checklist wrong with the video, they are still allowed to apply their own personal judgment about whether something is sensitive or inappropriate and have the video demonetized on that basis as well.
Here are the screenshots from the third party company’s viewpoint:
DEMONETIZATION LEAKED SCREENSHOTS: https://imgur.com/a/uTLTS
This will hopefully provide insight for creators who are confused about how YouTube decides what is okay and what isn’t. Truthfully, it’s not really YouTube deciding it. It’s the employees at the company they hired to review the videos. It’s my belief that any time you request a manual review, these people are the ones conducting it, and they can demonetize for whatever reason they see fit, even if they simply disagree personally with the content or message. This could explain why many videos that don’t violate any known criteria are still demonetized on manual review: the reviewer decided they should be, based on their own reasoning.
Now, as for the secret meeting, which may or may not tie into this whole issue (you decide if you think it does)...
This all started to take effect after YouTube held a private meeting with select creators. Obviously the media issues gave this all a violent shove into reality, but YouTube had been ramping up to make changes before this all came to pass. A different contact of mine was involved in the secret meeting with YouTube, which took place around mid 2016 at a Google office in Los Angeles. The meeting had one purpose: to discuss what should be done with unwanted creators on their platform. Some unwanted channel names that were mentioned were Leafyishere, GradeAUnderA, Keemstar/DramaAlert, Scarce, and Onision.
In attendance were a number of higher ranking individuals from a few different departments at YouTube. CEO Susan Wojcicki was not in attendance.
All the creators in attendance were made to sign NDAs. The creators were asked what they felt should be done about the more toxic channels on YouTube. There was no conclusion from YouTube’s end, but it was agreed that no censorship or channel deletion should occur and that something else should be done. Months later, demonetization began sweeping the platform and hindering the growth of countless channels. Many of us now know the reasons for this: ad companies got spooked by offensive content that was monetized. However, I am unsure whether demonetization is also enforced as strictly as it is partly to drive certain unwanted players off the field, in keeping with the concerns discussed at the meeting.
I hope this information proves useful to some of you. That’s all for now. If we can get this information around, I feel it will at least help creators get a better grip on what they’re up against. Share the screenshots, make videos discussing it, or simply observe, whatever you feel is best for you.
I wish you well,
-TD
Roosh said:
I started a Twitter account to list all Limited State videos:
https://twitter.com/YoutubeSandbox
One of the videos on the list has Hitler imagery.
If you encounter one, send me a PM on the forum or a DM through that Twitter account (faster).
Cr33pin said:
A lot of YouTubers' videos talking about the Logan Paul incident were deleted for complete BS reasons.
www.bloomberg.com/amp/news/articles/2018-03-21/youtube-bans-firearm-sales-and-how-to-videos-prompting-backlash
March 21, 2018, 1:20 PM EDT
Updated on March 21, 2018, 4:39 PM EDT
Google’s new policy prohibits promotion of guns, bump stocks
At least one video gun blogger ditched YouTube for PornHub
YouTube, a popular media site for firearms enthusiasts, this week quietly introduced tighter restrictions on videos involving weapons, becoming the latest battleground in the U.S. gun-control debate.
YouTube will ban videos that promote or link to websites selling firearms and accessories, including bump stocks, which allow a semi-automatic rifle to fire faster. Additionally, YouTube said it will prohibit videos with instructions on how to assemble firearms. The video site, owned by Alphabet Inc.’s Google, has faced intense criticism for hosting videos about guns, bombs and other deadly weapons.
For many gun-rights supporters, YouTube has been a haven. A current search on the site for “how to build a gun” yields 25 million results, though that includes items such as toys. At least one producer of gun videos saw its page suspended on Tuesday. Another channel opted to move its videos to an adult-content site, saying that will offer more freedom than YouTube.
“We routinely make updates and adjustments to our enforcement guidelines across all of our policies,” a YouTube spokeswoman said in a statement. “While we’ve long prohibited the sale of firearms, we recently notified creators of updates we will be making around content promoting the sale or manufacture of firearms and their accessories.”
YouTube has placed greater restrictions on content several times in the past year, responding to a series of issues with inappropriate and offensive videos. Most of those changes involved pulling ads from categories of videos. Google is more reluctant to remove entire videos from YouTube, but has been willing to do so with terrorism-related content.
The National Shooting Sports Foundation, a gun industry lobbying group, called YouTube’s new policy “worrisome.”
“We suspect it will be interpreted to block much more content than the stated goal of firearms and certain accessory sales,” the foundation said in a statement. “We see the real potential for the blocking of educational content that serves instructional, skill-building and even safety purposes. Much like Facebook, YouTube now acts as a virtual public square. The exercise of what amounts to censorship, then, can legitimately be viewed as the stifling of commercial free speech.”
The firearms decision comes days before Saturday’s March For Our Lives, a rally organized by survivors of the Feb. 14 school shooting in Parkland, Florida, that left 17 dead.
The new YouTube policies will be enforced starting in April, but at least two video bloggers have already been affected. Spike’s Tactical, a firearms company, said in a post on Facebook that it was suspended from YouTube due to “repeated or severe violations” of the video platform’s guidelines.
“Well, since we’ve melted some snowflakes on YouTube and got banned, might as well set IG and FB on fire!,” Spike’s wrote on Facebook, where it has over 111,000 followers, referring to the social network and its Instagram app. A YouTube spokeswoman said the channel has been reinstated after it was mistakenly removed.
InRange TV, another channel devoted to firearms, wrote on its Facebook page that it would begin uploading videos to PornHub, an adult content website.
“YouTube’s newly released, vague and one-sided firearms policy makes it abundantly clear that YouTube cannot be counted upon to be a safe harbor for a wide variety of views and subject matter,” InRange TV wrote. “PornHub has a history of being a proactive voice in the online community, as well as operating a resilient and robust video streaming platform.” PornHub didn’t immediately return a request for comment on the matter.
Last month, gun control activists escalated the pressure on tech giants for giving a platform to the National Rifle Association. A flurry of businesses cut ties with the pro-gun group after the deadly Parkland school shooting. Companies with streaming services, such as Amazon.com Inc., Apple Inc. and YouTube, declined to remove the NRA channel.
Kona said:
I've been trying to tell people this but no one listens:
When you use YouTube through Apple CarPlay, it censors all kinds of shit. Roosh and Davis Aurini among them. Nothing makes traffic more bearable than listening to Aurini/Matt Forney banter. But no dice. I tried it on my phone and in my truck at the exact same time. Nothing on CarPlay, but it came through on the phone.

I don't know what Apple CarPlay is, but are you signed into the same YouTube account as your phone when you use it?
Aloha!
The word “leak” is right. Our sense of control over our own destinies is being challenged by these leaks. Giant internet platforms are poisoning the commons. They’ve automated it. Take a non-Facebook case: YouTube. It has users who love conspiracy videos, and YouTube takes that love as a sign that more and more people would love those videos, too. Love all around! In February an ex-employee tweeted: “The algorithm I worked on at Google recommended [InfoWars personality and lunatic conspiracy-theory purveyor] Alex Jones’ videos more than 15,000,000,000 times, to some of the most vulnerable people in the nation.”
The head of YouTube, Susan Wojcicki, recently told a crowd at SXSW that YouTube would start posting Wikipedia’s explanatory text next to conspiracy videos (like those calling a teen who survived the Parkland, Fla., shooting a “crisis actor”). Google apparently didn’t tell Wikipedia about this plan.