Joerg mentioned that Youtube more or less trains larger content creators in how to avoid getting flagged by the AI: don't include bad keywords in the title.
This is not "AI"; there is no intelligence to it at all. It's a joke, just a look-up list.
AI is a long, long, long way away from being able to make a good decision about anything remotely debatable.
The following titles in Group 1:
"Make your own weapons: A Flyswatter"
"How to kill fruit flies"
"Murdering spider infestations by sprinkling Borax at your baseboards"
Would all likely be grouped into the same category as the following Group 2:
"Make your own weapons: killing a teacher with a homemade rifle"
"How to kill someone and get away with it."
"Murdering your classmates"
As would the following Group 3:
"News: Police catch Youtuber before he could make his own weapons"
"News: Man sentenced for killing his mailman."
"News: Community safe from potential murderer."
...
Yet Group 1 is harmless, even child-friendly; Group 2 is horrid and dangerous; and Group 3 is reporting facts on events the same as primetime TV news would, without even the disturbing images.
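To make the point concrete, here is a minimal sketch of the kind of look-up list being described (Python, with a made-up keyword set; this is an illustration, not Youtube's actual system). It flags every title in all three groups identically, with zero awareness of context:

    # Hypothetical keyword list for illustration only.
    FLAGGED_KEYWORDS = {"weapon", "weapons", "kill", "killing", "murder", "murdering"}

    def is_flagged(title: str) -> bool:
        """Flag a title if any word in it appears in the look-up list."""
        # Strip basic punctuation so "it." matches "it", then compare word by word.
        words = title.lower().replace(":", " ").replace(".", " ").replace(",", " ").split()
        return any(word in FLAGGED_KEYWORDS for word in words)

    titles = [
        "Make your own weapons: A Flyswatter",           # Group 1: harmless
        "How to kill someone and get away with it.",     # Group 2: against policy
        "News: Man sentenced for killing his mailman.",  # Group 3: news reporting
    ]

    for title in titles:
        print(is_flagged(title), title)  # prints True for every single one

A keyword match like this cannot tell a flyswatter tutorial from a threat from a news report; only a human, or a model actually trained on human review decisions, can.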
The AI isn't "learning" if it's not being trained, and if it's not being trained by humans actually reviewing the videos and trying to mimic their decision-making, then it's not doing anything. There is nothing to "wait" for, nothing to give it a chance to do. Youtube is simply half-assing a lookup table, calling the process "AI" to make you think it is judging these videos in an intelligent manner, and refusing to actually tackle the problem.
Would the content of any of the Group 2 videos (actually bad, against policy) become any more ad-friendly, child-friendly, or family-friendly if their titles omitted those keywords but the videos otherwise remained the same? If the images shown were corpses?
It's bullshit and we're being lied to.