YouTube's Changes Reduce Watch Time of Borderline Content and Harmful Misinformation by 70%

Mick West


The Four Rs of Responsibility, Part 2: Raising authoritative content and reducing borderline content and harmful misinformation
Tuesday, December 3, 2019

... So over the past couple of years, we've been working to raise authoritative voices on YouTube and reduce the spread of borderline content and harmful misinformation. And we are already seeing great progress. Authoritative news is thriving on our site. And since January 2019, we’ve launched over 30 different changes to reduce recommendations of borderline content and harmful misinformation. The result is a 70% average drop in watch time of this content coming from non-subscribed recommendations in the U.S.
Content from External Source
The above blog post from YouTube is the first actual number I've seen on the effectiveness of their recent changes clamping down on misinformation. And it's a pretty significant number: a 70% reduction is huge. The type of content that is being reduced is:
borderline content and content that could misinform users in harmful ways—such as videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11.
Content from External Source
In other words, just the type of disinformation that Metabunk is fighting against. So this is a good thing, mostly.

Why "mostly"? I think it's a net good, in that it will reduce the spread of this disinformation. But at an individual level, it might have a type of backfire effect, where people already convinced of a slew of conspiracy theories will see this as evidence of a cover-up of those conspiracy theories and hence become even more convinced.

There are also some ethical and slippery-slope concerns. Who decides what is "borderline"? Sure, most people agree that Flat-Earth is just nonsense, but fewer people agree chemtrails are nonsense, and quite a significant percentage of people actually believe "blatantly false claims about historic events like 9/11." What about when we get into political debunking? Does YouTube get to suppress videos that make borderline political claims that 30% of the people in the country think are true? Probably not (yet), but the line is fuzzy.

Still, I think the reduction in exposure to the more obviously wrong conspiracy theories is a good development. There's a chunk of the population who are vulnerable to them, and if they get sucked into the conspiracy theory rabbit hole, it makes it very easy to sway them to make life and voting decisions based on misinformation. Stopping people from becoming flat-earthers isn't automatically going to make them happy and productive members of society, but it's a solid step in the right direction.
 
Does YouTube get to suppress videos that make borderline political claims that 30% of the people in the country think are true? Probably not (yet), but the line is fuzzy.

Only 30% of the people in the country think that Epstein killed himself, and I doubt that YouTube would suppress that theory.
http://www.rasmussenreports.com/pub...rder_more_likely_than_suicide_in_epstein_case
The latest Rasmussen Reports national telephone and online survey finds that only 29% of American Adults believe Epstein actually committed suicide while in jail. Forty-two percent (42%) think Epstein was murdered to prevent him from testifying against powerful people with whom he associated. A sizable 29% are undecided.
Content from External Source
I think "borderline content" refers to "Content that comes close to — but doesn’t quite cross the line of — violating our Community Guidelines," so it doesn't get deleted but does get suppressed.
 
I think it may help prevent newcomers (as there are always newcomers) from stumbling into the rabbit hole.
Correct; this YouTube policy may reinforce "already believer" fears, turning the suppression into more "proof of a conspiracy."
So I would hope that YT would list and explain the reasons for removing specific "disinformation" or "borderline" content.
 
YouTube's blog post contains this half-truth:
For example, when people watch videos that encourage viewers to skip the MMR vaccine, we show information panels to provide more basic scientific context, linking to third-party sources.
Content from External Source
True, but YouTube also shows information panels under pro-vaccine videos and under NASA videos about Apollo 11. Presumably, YouTube doesn't suppress those videos but leaves the information panel there anyway for the conspiracy theorists. It's a little annoying.
 
Blatant "suppression" would amount to a conspiracy if people had no alternate explanation available for questionable theories.
 
If all the "moon landing was a hoax" vids were suppressed or deleted (example), how could inquisitive people determine the truth? All viewpoints need to be expressed for many people to be convinced one way or another.
 
I looked up "Flat Earth" on YouTube just to see what it'll recommend if I watch a Flat Earther video, but none of the search results were Flat Earther videos.
So I went to Eric Dubay's channel, and his recent video is titled "YouTube's Censorship Exposed"
Last year after uploading the video 'Level Earth | Fact vs Theory' to my YouTube channel, substantial views and likes were gained very quickly, but YouTube proceeded to delete the views, the likes, and the ability to like the video, all in an attempt to censor it from search results. Shortly afterwards, they broke their own TOS and deleted my entire channel (for the 3rd time, I am now on my 4th YouTube channel).
Content from External Source
None of the recommended videos were Flat Earther or conspiracy theory videos.
 
If I look up "Epstein murder" on YouTube, I get mainstream media and tabloid videos like "Jeffrey Epstein Death: Medical examiner says prison death was homicide" and "Epstein accuser on his death: I'm absolutely suspicious of it."
 
If I look up "White Helmets" on YouTube in a private window, I get almost exclusively positive coverage from Western media, Turkey, and Al Jazeera (Qatar). If I look up "White Helmets RT", I get all the Russian propaganda calling them terrorists.

If I look up "Белые каски" in Russian, I get a lot of Russian state media and some Radio Free Europe videos in Russian.
 
I looked up "Flat Earth" on YouTube just to see what it'll recommend if I watch a Flat Earther video, but none of the search results were Flat Earther videos.
So I went to Eric Dubay's channel, and his recent video is titled "YouTube's Censorship Exposed"
Last year after uploading the video 'Level Earth | Fact vs Theory' to my YouTube channel, substantial views and likes were gained very quickly, but YouTube proceeded to delete the views, the likes, and the ability to like the video, all in an attempt to censor it from search results. Shortly afterwards, they broke their own TOS and deleted my entire channel (for the 3rd time, I am now on my 4th YouTube channel).
Content from External Source
None of the recommended videos were Flat Earther or conspiracy theory videos.

If Eric Dubay makes a claim, that is prima facie evidence that the claim is false.

I just checked out his YouTube channel. (I feel dirty, but someone has to do it.)

The channel has hundreds of videos, mainly but not exclusively on flat earth, going back over a year, most of them with hundreds of 'likes'. They include several videos complaining about censorship by YouTube, which YouTube has amusingly failed to delete. If YouTube has indeed censored some of Dubay's videos, or even deleted his entire channel (which evidently hasn't stopped him starting again), could there conceivably be reasons other than his promotion of flat earth? In one of his videos complaining about censorship he mentions 'spurious copyright claims', but the very same video which Agent K refers to has a soundtrack using music by Hans Zimmer, which I'm pretty sure is unlicensed. (I know it is by Hans Zimmer because the description says so. Possibly Dubay is one of the deluded souls who believe that giving credits for the music used somehow immunises him against a copyright strike.) He also complains that a video by him about Hitler and the Holocaust was deleted, which reminds me that he is a notorious anti-Semite. If he wants to deny that, maybe he should stop referring to YouTube as 'Jewtube'.
 
"The result is a 70% average drop in watch time of this content"

They could make it almost 100% by removing the content, but they choose not to. What percentage drop do they strive to achieve? If it's 100%, then it's the same as removing the content.

Edit: Actually, the full quote is "The result is a 70% average drop in watch time of this content coming from non-subscribed recommendations in the U.S."
So even if it were 100%, it wouldn't be the same as removing the content, because subscribers would still see it. But my point still stands that YouTube wants to reduce the watch time but doesn't remove the content.
 
"The result is a 70% average drop in watch time of this content"

They could make it almost 100% by removing the content, but they choose not to. What percentage drop do they strive to achieve? If it's 100%, then it's the same as removing the content.

I think they want to make it so that YouTube itself is not promoting the content. I suspect their goals are relatively complex and relate to advertising revenue, liability issues, and potential future regulation, as well as making the world a better place.
 
I think they want to make it so that YouTube itself is not promoting the content. I suspect their goals are relatively complex and relate to advertising revenue, liability issues, and potential future regulation, as well as making the world a better place.

If they think the content is so bad, why not remove it?
 
as well as making the world a better place.

:) I personally doubt that. Things like the Sandy Hook hoax or drinking bleach to cure cancer hurt their brand and probably ad revenue. If they wanted to make the world a better place, they would have done it without a thousand people from Las Vegas, Sandy Hook parents, etc. freaking out at their company.
 
:) I personally doubt that.

As I said, I suspect their goals are relatively complex. They are not a charity. Their corporate goal is not "make the world a better place, no matter what the effect on our bottom line." They want to make money, but I know that there are people who work at Alphabet (Google, YouTube, etc) that also want to make the world a better place. So that probably has some effect on corporate decisions.
 
There are also some ethical and slippery-slope concerns. Who decides what is "borderline"? Sure, most people agree that Flat-Earth is just nonsense, but fewer people agree chemtrails are nonsense, and quite a significant percentage of people actually believe "blatantly false claims about historic events like 9/11." What about when we get into political debunking? Does YouTube get to suppress videos that make borderline political claims that 30% of the people in the country think are true? Probably not (yet), but the line is fuzzy.
The problem is not only "where do you draw the line" but also how do you accurately classify the videos in the first place. How does YouTube ensure they're only suppressing flat earth videos and not any videos that are debunking flat earth videos? As Agent K mentioned, you will sometimes see information panels under pro-vaccination and pro-NASA videos. I wouldn't presume that those videos are NOT being suppressed just because we know they're anti-conspiracy, the problem is that these decisions are being made by algorithms automatically. As YouTube is fond of pointing out, the amount of video content that is uploaded every minute far outstrips any human moderation team's ability to watch it.
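As a toy illustration of that classification problem (this is emphatically not YouTube's actual system, just a sketch of why it's hard), a naive keyword-based flagger can't tell a flat-earth video from a flat-earth debunk, because both use the same vocabulary:

```python
# Toy sketch: a naive keyword classifier flags debunking videos too,
# because pro- and anti-conspiracy videos share the same vocabulary.
# The phrase list is made up for illustration.

FLAGGED_TERMS = {"flat earth", "chemtrail", "moon landing hoax"}

def naive_flag(title: str) -> bool:
    """Flag a video if its title contains any suspect phrase."""
    lowered = title.lower()
    return any(term in lowered for term in FLAGGED_TERMS)

titles = [
    "PROOF the Earth is Flat: Flat Earth Experiments",  # borderline
    "Debunked: Every Flat Earth Argument, Refuted",     # debunk
    "Why the Moon Landing Hoax Claims Fall Apart",      # debunk
]
for t in titles:
    print(naive_flag(t), t)
# All three titles get flagged -- the two debunks are false positives.
```

A real system presumably uses far richer signals than title keywords, but the underlying problem is the same: the topic is easy to detect, the stance is not.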

I have seen Apollo Wikipedia information panels occasionally pop up in my own videos about SpaceX launches and landings, and I have reason to suspect that my videos are occasionally being suppressed in search results. I think their classifier is rather bad at distinguishing pro- vs anti-conspiracy videos in some cases, but as long as they see it promoting "authoritative" news sources' coverage of these launches, I bet that's all that matters to them. False positives against minor channels probably don't rise to the level of a problem they feel they need to correct.

I have noticed a dramatic shift in the traffic-source analytics on my channel's launch videos in the last year or so. Whereas the traffic I have received from external sources like Reddit has remained more or less constant (particularly for the ever-popular Falcon Heavy launches), the amount of traffic coming from within YouTube itself fell off a cliff, despite an increase in subscribers over that time period.

I also did some testing, searching with a randomized keyword list that should lead to those videos. When searching with a US IP address, my videos appear much farther down in the search results than when searching from another country using a VPN, despite using the same keywords in the same randomized order. It doesn't happen with all channels, though; my friend filmed the same launches, and uploads to his channel showed similar rankings in both domestic and foreign search results.

Bottom line, though YouTube would never admit it, I think their algorithms end up suppressing a lot of content incorrectly, but unlike with algorithmic demonetization they don't even tell you when your videos are being impacted.
 
Bottom line, though YouTube would never admit it, I think their algorithms end up suppressing a lot of content incorrectly, but unlike with algorithmic demonetization they don't even tell you when your videos are being impacted.

My channel is currently demonetized because of some old Jihadi John clip I uploaded for reference when looking at the claims that it was staged. I thought the strike was going to expire, but it didn't, and there's no actual recourse for appealing if you are a small channel. It's not like I was actually making any money from it, but it seems like I was swept into a bucket with the actual purveyors of bunk.

It may well be that my debunking videos are de-ranked too. All the chemtrail debunking stuff gets the same contrail infobox as pro chemtrail videos.

Metabunk 2019-12-09 09-52-27.jpg
 
I have seen Apollo wikipedia information panels occasionally pop up in my own videos about SpaceX launches

I don't know if this is really a thing, but if you contact them about the information panel pop-ups and explain you are not a conspiracy theorist or whatever, they may be able to tag your channel as "ok, run-of-the-mill science stuff," which might help your recommendations. (A lot of viewers coming from Reddit might also be a red flag; I'm guessing on that.)

I'm sure there are YouTube videos on "gaming the system." A few of the channels I watch in between have been talking about problems and having to speak with YouTube. Changing video titles seems to be an issue. Also watch what you SAY in videos, if you speak; their bots listen to the videos now and flag videos based on key words.

All the chemtrail debunking stuff gets the same contrail infobox as pro chemtrail videos.
but Dane Wigington still does not have an info panel even though i've tried alerting YouTube to that multiple times.
 
but Dane Wigington still does not have an info panel even though i've tried alerting YouTube to that multiple times.

Some of his videos do. I suspect it's because he avoids using the term "chemtrail", so it only happens when he talks about contrails a lot. It does seem like a failing in the algorithm.
Metabunk 2019-12-09 10-06-16.jpg
 
If Eric Dubay makes a claim, that is prima facie evidence that the claim is false.

I just checked out his YouTube channel. (I feel dirty, but someone has to do it.)

The channel has hundreds of videos, mainly but not exclusively on flat earth, going back over a year, most of them with hundreds of 'likes'. They include several videos complaining about censorship by YouTube, which YouTube has amusingly failed to delete. If YouTube has indeed censored some of Dubay's videos, or even deleted his entire channel (which evidently hasn't stopped him starting again), could there conceivably be reasons other than his promotion of flat earth? In one of his videos complaining about censorship he mentions 'spurious copyright claims', but the very same video which Agent K refers to has a soundtrack using music by Hans Zimmer, which I'm pretty sure is unlicensed. (I know it is by Hans Zimmer because the description says so. Possibly Dubay is one of the deluded souls who believe that giving credits for the music used somehow immunises him against a copyright strike.) He also complains that a video by him about Hitler and the Holocaust was deleted, which reminds me that he is a notorious anti-Semite. If he wants to deny that, maybe he should stop referring to YouTube as 'Jewtube'.

Yeah, I'm not going down the rabbit hole of fact checking his complaints. Apparently the video he was talking about is actually another flat earther's remix of a previous video of his, and they're mirroring each other's videos. I just thought it was notable that I had to directly go to his channel due to YouTube's suppression of Flat Earther videos, and his second-most-recent video was complaining about censorship and promising that "starting next week I will start uploading certain videos ONLY to BitChute, LBRY, Brighteon and NOT YouTube."
When I search for one of his videos by name, how the midnight sun works on flat earth, his video shows up ninth from the top. When I search with quotes, "how the midnight sun works on flat earth", his video is third from the top, and the top two results are other channels mirroring his video.
I've previously brought up the overlap between Flat Earthers and anti-Semitism: Brother Ernest, Tila Tequila, B.o.B., Dubay, etc.
 
I wouldn't presume that those videos are NOT being suppressed just because we know they're anti-conspiracy, the problem is that these decisions are being made by algorithms automatically... Bottom line, though YouTube would never admit it, I think their algorithms end up suppressing a lot of content incorrectly, but unlike with algorithmic demonetization they don't even tell you when your videos are being impacted.

I presumed that the NASA videos are not suppressed because they show up at the top of the search results for "moon landing" despite having the Apollo 11 information panel linking to Encyclopedia Britannica.
But you're right that small pro-NASA and debunker channels may get mistaken for borderline content.

YouTube's blog post explains that YouTube uses external evaluators, following public guidelines, to train machine learning models:
https://youtube.googleblog.com/2019/12/the-four-rs-of-responsibility-raise-and-reduce.html
So how does this actually work? Determining what is harmful misinformation or borderline is tricky, especially for the wide variety of videos that are on YouTube. We rely on external evaluators located around the world to provide critical input on the quality of a video. And these evaluators use public guidelines to guide their work. Each evaluated video receives up to 9 different opinions and some critical areas require certified experts. For example, medical doctors provide guidance on the validity of videos about specific medical treatments to limit the spread of medical misinformation. Based on the consensus input from the evaluators, we use well-tested machine learning systems to build models. These models help review hundreds of thousands of hours of videos every day in order to find and limit the spread of borderline content. And over time, the accuracy of these systems will continue to improve.
Content from External Source
I wonder if the machine learning actually looks at the contents of the video or just the metadata like the title and description. And when they say that there's a "70% average drop in watch time of this content," did they use humans to verify that "this content" was indeed borderline, or could it include a lot of false positives?
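For what it's worth, the evaluator process the blog post describes (up to 9 human ratings per video, aggregated by consensus into training labels) could look something like the sketch below. The threshold, label names, and aggregation rule here are my assumptions, not anything YouTube has published:

```python
# Hedged sketch of consensus labeling from multiple human evaluators.
# Label names and the agreement threshold are assumptions for
# illustration only.
from collections import Counter

def consensus_label(ratings: list[str], min_agreement: float = 0.5) -> str:
    """Return the majority label if it clears the agreement threshold;
    otherwise flag the video for further review."""
    label, count = Counter(ratings).most_common(1)[0]
    if count / len(ratings) > min_agreement:
        return label
    return "needs_more_review"

# 7 of 9 evaluators call the video borderline: clear consensus.
print(consensus_label(["borderline"] * 7 + ["ok"] * 2))          # borderline
# A 50/50 split doesn't clear the threshold.
print(consensus_label(["borderline", "ok", "borderline", "ok"]))  # needs_more_review
```

Whether the resulting labels are applied to video content or just metadata is exactly the open question raised above.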
 
I presumed that the NASA videos are not suppressed because they show up at the top of the search results for "moon landing" despite having the Apollo 11 information panel linking to Encyclopedia Britannica.
But you're right that small pro-NASA and debunker channels may get mistaken for borderline content.

YouTube's blog post explains that YouTube uses external evaluators following public guidelines to train machine learning models
https://youtube.googleblog.com/2019/12/the-four-rs-of-responsibility-raise-and-reduce.html
So how does this actually work? Determining what is harmful misinformation or borderline is tricky, especially for the wide variety of videos that are on YouTube. We rely on external evaluators located around the world to provide critical input on the quality of a video. And these evaluators use public guidelines to guide their work. Each evaluated video receives up to 9 different opinions and some critical areas require certified experts. For example, medical doctors provide guidance on the validity of videos about specific medical treatments to limit the spread of medical misinformation. Based on the consensus input from the evaluators, we use well-tested machine learning systems to build models. These models help review hundreds of thousands of hours of videos every day in order to find and limit the spread of borderline content. And over time, the accuracy of these systems will continue to improve.
Content from External Source
I wonder if the machine learning actually looks at the contents of the video or just the metadata like the title and description. And when they say that there's a "70% average drop in watch time of this content," did they use humans to verify that "this content" was indeed borderline, or could it include a lot of false positives?
Don't forget, YouTube also keeps beating the drum that they are making deliberate efforts to raise up "authoritative sources" in the search results. That would surely include official NASA YouTube channels for the various NASA centers and NASA TV. It would also probably override any false positives created by their other machine learning algorithms. You would hope they would use the false positives to retrain the algorithm, but even that is no guarantee that smaller creators will avoid being suppressed by false positive detections. Personally I would just like to see which of my videos are triggering any suppression algorithms, but I doubt they'll ever show that to try to avoid people gaming the system.

I think it's pretty obvious that YouTube is going to be more motivated by a desire to suppress fringe content as a reaction to controversy than they are to make sure they minimize suppression of smaller content creators. On the contrary, I think they're adopting a "shoot first, ask questions later" stance where "trusted creators" (ie, large channels) and "authoritative sources" are automatically given top billing in the search results. Even without direct suppression, smaller creators will naturally be bumped down in search results just by the external pressure they're exerting to force authoritative and trusted sources up to the top. The only way to avoid it would be to avoid making videos on subjects where such forces are applied (news stories, current events, or controversial issues).
 
Now here's a "false flag operation", pun intended.
https://www.independent.co.uk/life-...ot-combat-videos-animal-cruelty-a9071576.html
YouTube has removed hundreds of videos showing robots battling other robots after claiming they are in breach of its rules surrounding animal cruelty. YouTuber and robot enthusiast Anthony Murney was one of the first to highlight the issue, who blamed a new algorithm introduced by YouTube to detect instances of animal abuse.
Content from External Source
Those videos were actually removed, not just suppressed. Had they been merely suppressed, the algorithm's error might not have been caught.
 
the algorithm's error

sounds more like the algorithm's independent [i.e., artificial] intelligence is growing. Reinstating those videos will probably piss it off. o_O

"On the 8th day Machine just got upset.
A problem Man had not foreseen as yet."
Hazel O'Connor
 
I am not sure that suppressing information is the answer.
Sure, a small number of Sandy Hoaxers committed borderline child abuse and other kinds of abuse against grieving parents, but that should have been dealt with as a criminal offence enforced by the law, not as a misdemeanor resulting in a YouTube strike or Facebook censorship, and the correct people need to be held responsible for it. It is not the responsibility of YouTube or its users; if a person has committed a criminal offence, that needs to be dealt with by law enforcement. I do not believe that YouTube or any corporation should have the ability to decide, or assume responsibility for, anything other than making sure they report crime to the police.
Sandy Hoaxers are also not the only ones that committed criminal acts, but are a good example.

I think, overall, the arguments for suppressing information are perhaps borderline mass hysteria, something humanity is well known for.
But I think a lot of suppressing will get done for reasons more trivial than the information having a huge adverse effect on society; it will be done simply because an opinion is unpopular, and most people like their opinions to be the only one, or to protect the news industry.

Overall, I would say that the world has become a better place since the internet and YouTube.
I think the world is smarter, healthier, and happier than before YouTube, and I believe happiness to be important to good health and longevity, and life expectancy is up.

WW1, WW2, and 9/11 all happened before YouTube was created, so the world was not any better then.

I appreciate that people always want to strive to improve and make the world a better place, but that does not mean they are making the world a better place.

Perhaps the anger of living in a world full of ignorance, with no social media and a feeling of being spoonfed by the MSM, was a bigger cause of the 9/11 catastrophe than the conspiracy theories that followed the event.

Perhaps one should be careful not to return to that, look at the positives instead of the negatives, and try to counter misinformation in a better and more enlightening way than simply suppressing it.
 
If all the "moon landing was a hoax" vids were suppressed or deleted (example), how could inquisitive people determine the truth? All viewpoints need to be expressed for many people to be convinced one way or another.
I understand what you're saying, but wouldn't it be better to answer those questions through more credible sources? The Apollo hoax is based on a belief in pseudoscience, not actual science.
 
I just noticed that videos from Chinese state media about the coronavirus have the coronavirus info panel in place of the Chinese government funding disclosure that appears under videos on other topics.
Compare the info panels under these two videos from CGTN.
1584830778868.png1584830731205.png

Also, the coronavirus panel now links to the CDC instead of the WHO, even if I set my location to a different country.
 
I just noticed that videos from Chinese state media about the coronavirus have the coronavirus info panel in place of the Chinese government funding disclosure that appears under videos on other topics.
I suspect that means they can only have one info panel, and spreading info about the coronavirus (with the link to the CDC) is thought to be more important than identifying potential bias in a news source.
 
I suspect that means they can only have one info panel, and spreading info about the coronavirus (with the link to the CDC) is thought to be more important than identifying potential bias in a news source.

Yup, but why not have more than one info panel? I personally find the media source disclosure more informative.
 
Also, the coronavirus panel now links to the CDC instead of the WHO, even if I set my location to a different country.
Are you certain that changing your location has an effect?
If you'd like to test, that's a German video, it doesn't link to the WHO or the CDC, and the channel has a state media insert normally. This is what it looks like in Germany:
image.jpeg
Youtube also offers support information on the insert, following "Why is this message displayed?" gives this page:
image.jpeg
The green box translates to this: Note: Because of the respiratory illness (COVID-19) caused by the Corona virus, you may see information boxes for the World Health Organization WHO on this topic. In some countries and regions, these information boxes get translated into the local language and contain links to corresponding resources like national health agencies.
The bottom paragraph states: Countries that have this information box available are: Germany, Spain, South Korea, United Kingdom, USA
The yellow box at the top begins: The latest information on how we are handling the Coronavirus (COVID-19) situation can be found at https://g.co/yt-covid19 .
 


YouTube has deleted the account of David Icke, a conspiracy theorist who has touted the myth that the COVID-19 pandemic is linked to 5G.

Icke's account was removed after YouTube decided he broke its rules on sharing information about COVID-19, a disease which has infected more than 3.4 million people worldwide, according to numbers tracked by Johns Hopkins University.

A spokesperson for the video sharing site told Newsweek: "YouTube has clear policies prohibiting any content that disputes the existence and transmission of Covid-19 as described by the WHO [World Health Organization] and the NHS [the U.K's healthcare system].

Content from External Source
https://www.newsweek.com/david-icke...has-youtube-channel-shuttered-sharing-1501641
 



YouTube has deleted the account of David Icke, a conspiracy theorist who has touted the myth that the COVID-19 pandemic is linked to 5G.

Icke's account was removed after YouTube decided he broke its rules on sharing information about COVID-19, a disease which has infected more than 3.4 million people worldwide, according to numbers tracked by Johns Hopkins University.

A spokesperson for the video sharing site told Newsweek: "YouTube has clear policies prohibiting any content that disputes the existence and transmission of Covid-19 as described by the WHO [World Health Organization] and the NHS [the U.K's healthcare system].

Content from External Source
https://www.newsweek.com/david-icke...has-youtube-channel-shuttered-sharing-1501641


It's also a bit of nanny-stateism, and it reduces access to some content for those who enjoy exploring the weirder side of YouTube, but as it's a private entity, what can you do?
 
My channel is currently demonetized because of some old Jihadi John clip I uploaded for reference when looking at the claims that it was staged. I thought the strike was going to expire, but it didn't, and there's no actual recourse for appealing if you are a small channel. It's not like I was actually making any money from it, but it seems like I was swept into a bucket with the actual purveyors of bunk.

It may well be that my debunking videos are de-ranked too. All the chemtrail debunking stuff gets the same contrail infobox as pro chemtrail videos.

Metabunk 2019-12-09 09-52-27.jpg

Some of the strikes are political as well. A friend had three channels on YT: one pro-Second Amendment, one about archery, and one about camping. Without warning, all three were banned. I never saw the pro-gun one, so I don't know if it was radical or not, but all were banned, and even though an appeal was filed, there was no response and nothing he could do. Their system itself is a bit flawed, even more so if a channel is monetized.
 
Its also a bit of Nannystateism, and reduces the access to some content by those who enjoy exploring the weirder side of Youtube, but as its a private entity what can you do?
The thing is that incitement to violence doesn't become legal just because it's weird (and I don't think combating that has anything to do with nannies, state or otherwise).
I guess if nobody had burned any 5G towers, Icke would still be there.
 