Karyis
YouTube just banned supremacist content, and thousands of channels are about to be removed
I don't know if this goes here or in News & Politics, but since it's about a social media site for sharing videos (i.e., tech-related), I figured here would be a safe bet. Also, if there's already a thread for this, sorry; I searched a bit and couldn't find one.
Ahem, anyway: while I think this is a really good thing that needs to happen and finally is happening, I'm worried about the algorithm accidentally flagging totally legitimate people and LGBTQ+ content, since YouTube has messed that up badly in the past. Will there be any human oversight here? I've heard their appeals process already takes forever, so if they're serious about getting rid of all the Nazis, they're going to need to step up to make sure they're targeting the right people. Not to mention, as the first article states, they're not removing an alt-right commentator who has repeatedly harassed an LGBTQ+ Vox employee (which several other news sites have reported on, too).
What do you think?
"The openness of YouTube's platform has helped creativity and access to information thrive," the company said in a blog post. "It's our responsibility to protect that, and prevent our platform from being used to incite hatred, harassment, discrimination and violence."
The changes announced on Wednesday aim to improve YouTube's content moderation in three ways. First, the ban on supremacists will remove Nazis and other extremists who advocate segregation or exclusion based on age, gender, race, religion, sexual orientation, or veteran status. In addition to those categories, YouTube is adding caste, which has significant implications in India, and "well-documented violent events," such as the Sandy Hook Elementary School shooting and 9/11. Users are no longer allowed to post videos claiming those events did not happen, YouTube said.