What Causes YouTube To Remove Videos?
By Mae Anderson
YouTube often takes action against videos that violate its guidelines, and has well-established procedures for doing so. The "YouTubers" who produce videos and post them on the site aren't always happy about its decisions, but their discontent rarely leads to violence.

That may have changed on April 2, when Nasim Aghdam -- herself a YouTuber -- shot and wounded three people at YouTube headquarters in San Bruno, California, before killing herself.

The 39-year-old told family members that she believed the company was suppressing her videos, which included segments about veganism, animal cruelty and exercise, along with glamor shots of herself. There's no evidence that YouTube was actually doing this, and the company did not immediately respond to a request for comment.

But Aghdam's father said his daughter was angry that YouTube stopped paying for videos she posted on the platform and warned police she might go to the company's headquarters. Here's a brief explanation of YouTube's video policies and the steps it can take against violators.

YouTube Rules

The tragic shooting highlights the often difficult balance that YouTube tries to strike between protecting freedom of expression and barring videos that violate its prohibitions against violence, extremism and other objectionable material.

YouTube, which is owned by Google, doesn't allow nudity, hate speech, violent behavior, harassment, bullying or impersonating others, among other things. Posting copyrighted material is also forbidden. But the site says it has over a billion users in 88 countries and 1 billion hours of video watched daily, and that scale can be difficult to police.

"The scale of the challenge is something that's hard for anyone to wrap their minds around," said Paul Verna, a principal analyst at eMarketer. "It's a little bit like the game whack-a-mole."

Advertising Limits

YouTube has been tightening restrictions for its ad program since last year, when some large corporations began boycotting the site because their ads were turning up next to clips promoting terrorism and racism. That March, Google promised to hire more human reviewers and upgrade its technology to keep ads away from repugnant videos.

In January, YouTube changed a key benchmark for a program that lets YouTubers with smaller audiences make money from advertising that appears next to their videos. The change, the company said, aimed to strengthen "requirements for monetization" to prevent spammers and other malicious actors from exploiting the service.

The change meant that YouTubers wouldn't get paid unless they had more than 1,000 subscribers with 4,000 hours of viewing time in the past year. Previously, they only needed 10,000 lifetime views of their video channels.

A Bigger Hammer

Some famous YouTubers have gotten crosswise with the site. YouTube star Logan Paul caused a furor in January after he posted a video of himself in a Japanese forest near Mount Fuji, close to what appeared to be a body hanging from a tree. YouTube suspended the 22-year-old at the time for violating its policies.

But Paul returned and subsequently posted a video of himself using a Taser on dead rats. That spurred YouTube to temporarily suspend all ads from Paul's channel after what it called a pattern of behavior unsuitable for advertisers.

It also led YouTube to update its policies with new steps it can take against violators. It can now slap age restrictions on some material, shut off the flow of money from ads, delete particular videos and blacklist channels from its powerful recommendation and trending lists. A "strike system" can eventually lead to a channel being terminated altogether.

In February 2017, YouTube distanced itself from Felix Kjellberg, a top YouTube star known online as PewDiePie, after he made jokes construed as anti-Semitic and posted Nazi imagery in his videos.

At the time, YouTube canceled the release of the second season of Kjellberg's reality show "Scare PewDiePie" and removed the PewDiePie channel from an advertising program that bundles popular YouTube videos for advertisers to buy time on.

© 2018 Associated Press under contract with NewsEdge/Acquire Media. All rights reserved.

Image credit: iStock/Artist's Concept.
