Pictures of your child on YouTube, discriminatory statements in comments, nasty jibes in a Messenger group or forbidden symbols: content on the Internet can cause trouble for many reasons. But such content does not simply appear "somewhere on the web" — it appears on specific platforms, apps and websites. And that is also where you can report it.
Providers try to set rules and enforce them as best they can. There are even laws requiring this, such as the Youth Protection Act. When content is reported, the platform must check within a short time whether it is illegal and, if so, delete it. For this, platforms depend on the participation and help of their users. Whether on YouTube, WhatsApp, Instagram or TikTok: everywhere, you or your child can report content that may violate platform policies or the law. Reports of racist content, for example, have increased dramatically in recent years, according to YouTube and Facebook.
Reviews and comments from other users or parents can also be helpful when you are unsure whether an app or game is suitable. For example, app stores often indicate when an app contains a lot of ads or in-app purchases, or when a game is very violent. If you or your child have had experiences of your own, or if content seems suspicious to you, share this with others — it is a valuable way for users and parents to support each other.