YouTube is not really recommended for children. That includes its “filtered” mobile version, YouTube Kids, where shocking, even downright violent, videos often circulate. Hence the interest in reconsidering the option of parental controls.
If you have kids, it can be tempting to put your smartphone or tablet in their hands and connect them to YouTube when you’re out, or simply want to keep them busy. But have you considered the worst? Without wanting to lapse into paranoia, know that you would be wrong to blindly trust this video platform. You never know what they might find there, between a cat video and a cartoon: why not an erotic video, or a clip from a violent film?
Fortunately, you might think, Google has thought of everything and offers a mobile application dedicated to children: YouTube Kids. The service, available in France since the end of 2016, offers curated content, parental control features, and a filtering system meant to “protect” our dear toddlers from “inappropriate” content.
Videos promoting suicide
Except that even on the kids’ version of YouTube, something seems rotten. An American pediatrician, Free Hess, recounted last February how she discovered, in the middle of a cartoon, a short sequence explaining to children how to kill themselves. According to her, as she watched a video with her son, it stopped for a few seconds before cutting to a clip in which a man, facing the camera, mimed cutting his wrist and explained, roughly speaking, how not to fail at it, before the cartoon resumed.
This video, which had been circulating quietly on YouTube Kids for eight months, and which she had time to capture in a screenshot, has since been deleted. But how many children viewed it before then? “I think it’s extremely dangerous. Our children face a whole new world with social media and access to the Internet. It changes the way they grow and develop. Such videos put them at risk,” says Free Hess. According to CBS News, the author of the suicide-incitement footage is the YouTuber “Filthy Frank”, a “humorist” followed by 6.2 million people, though there is no evidence yet whether it was him or someone who spliced his images into the cartoon.