When Tim was researching this article, he would send me examples from these videos, asking if they were things that would be moderated on Ars. Every time it was an easy "oh yeah, for sure" answer.
Moderating is hard. But sometimes it's easy.
"On several occasions, YouTube demonetized some of their videos, blocking them from generating revenue via ads. They say these decisions were made based on vague criteria with no meaningful opportunity to appeal."
That's every YouTube creator ever. YouTube's algorithms are opaque (even to YouTube a lot of the time) and change constantly, and their creator support is basically non-existent. Even huge, successful channels just basically throw up their hands and shrug when asked about details of things like that, because nobody knows.
a term used by some Black people as a derogatory term for a Black person who sells out the African American community.
In other cases, it's simply not clear if the plaintiffs have a viable cause of action. The plaintiffs make a big deal out of YouTube's demonetization decisions. But US law—including the First Amendment—gives advertisers and ad networks broad discretion to decide whom they want to do business with. YouTube or its advertisers, for example, would be well within their rights to decide they would rather not run advertisements on videos about controversial political topics—or on channels by controversial figures. It's entirely legal for advertisers to take sides in political debates. So even if it were true, as plaintiffs claim, that YouTube was systematically demonetizing videos about the Black Lives Matter movement, that wouldn't be against the law.
"Time to remove section 230 protections for companies that engage in editorial behavior."
Oh good, the stupid is here.
"Time to remove section 230 protections for companies that engage in editorial behavior."
https://www.techdirt.com/articles/20200 ... -act.shtml
"a term used by some Black people as a derogatory term for a Black person who sells out the African American community."
It's not analogous to an "Uncle Tom," as it's just a general slur for Black people, originating from and primarily used in the Cajun South.
"Time to remove section 230 protections for companies that engage in editorial behavior."
Agreed. The farmers who won't let me put up my pro-abortion message on the billboards on their land need to have their rights revoked. They can't be publishers and platforms! \s
"Time to remove section 230 protections for companies that engage in editorial behavior."
https://www.techdirt.com/articles/20200 ... -act.shtml
"Time to remove section 230 protections for companies that engage in editorial behavior."
"Oh good, the stupid is here."
It was here 5 comments earlier than that.
"Time to remove section 230 protections for companies that engage in editorial behavior."
"https://www.techdirt.com/articles/20200 ... -act.shtml"
Jinx!
"Time to remove section 230 protections for companies that engage in editorial behavior."
Can you explain, Dandere, what you hate so much about gardening, car, or aircraft forums, for example? Even if you aren't into any of those things, or don't have any hobbies at all, why don't you want other people to be able to enjoy them and keep them on topic, rather than having every single forum devolve into a generic mess of anarchy? Here on Ars Technica, for example, we've got a wide range of forums devoted to various platforms, kinds of hardware, software, science, business, political discussion, etc. Why should that be banned?
"content claiming abuse"
Yeah, no, the DMCA's restrictions on false claims are completely toothless and actually incentivize the use of dumb automated takedown systems (of which YouTube's is the best, just to give you a sense of how low the bar is).
We need to enact laws that force companies to use the proper channels for DMCA claims. YouTube sidesteps that process by running its own copyright claim system (basically skirting the DMCA), which allows companies to outright abuse the process. The DMCA is strict about false claims and will punish them.
YouTube "knowingly, intentionally, and systematically employs artificial intelligence algorithms, computer and machine based filtering and review tools to 'target' Plaintiffs and all other persons similarly situated, by using information about their racial identity and viewpoint to restrict access and drive them off YouTube," the lawsuit states.
Making sure your algorithms can't be "knowingly, intentionally, and systematically" racist is probably one of the few things tech companies actually know they can't do and consistently avoid doing. Intentional racial discrimination against users is well outside the scope of Section 230 immunity, if you can prove it, but there's no goddamn way a company the likes of Google/YouTube isn't already doing at least some basic level of vetting to ensure its algorithms are at least facially neutral. Those algorithms get used internationally, and putting aside the risk of race-discrimination lawsuits in the US, a growing number of countries (including all of the European Union, per the GDPR) have explicit laws forbidding even the inclusion of race as an identified or inferred data element in an algorithm without the user's express consent.
Yeah, YouTube creators have 99 problems and I guess emergent racism from an algorithm could actually be one as well.
I’m not sure that this is lawsuit-worthy, but it’s certainly something YouTube should be vigilant for, even if this particular claim were to be shown to be inaccurate.
"On several occasions, YouTube demonetized some of their videos, blocking them from generating revenue via ads. They say these decisions were made based on vague criteria with no meaningful opportunity to appeal."
That's how YouTube works for everyone, everywhere. We're all equal in the eyes of YouTube; that is, we're all the ones who create the content for their platform that they get paid for, there to be screwed out of our money whenever YouTube feels like it.
The plaintiffs have also apparently had some difficulties with YouTube's copyright takedown system. The lawsuit indicates that multiple plaintiffs had videos taken down over copyright concerns—perhaps due to the inclusion of significant clips of copyrighted videos for purposes of commentary.