The smartphone is at the ready, the console is running hot, videos or games don’t stop – many parents want more offline time for their child. This is not about banning or denigrating media. Digital media is a natural part of growing up today. A good balance between online and offline time in everyday life is crucial.
Children and young people need regular breaks without screens. Offline time allows for exercise, creative play, relaxation and real encounters with others. This strengthens imagination, concentration and independence and helps to reduce stress, even if children don’t always realize it themselves.
The older children get, the more important media use becomes for friendships and belonging. For young people in particular, the worry of missing out plays a major role. For some, media is even the most important way to stay in touch with friends or a community, for example when like-minded people don’t live in the immediate vicinity. It is then often difficult to switch off because chats, games or videos are socially important. This is where an open discussion about alternatives can help.
If media is available at all times, many children do not stop on their own. This is why children need guidance and support, e.g. through fixed offline times or media-free places such as the dinner table and before bedtime. It is important not to use media time as a reward or punishment, as this can further increase its importance.
Strict prohibitions often lead to arguments or secret use. It is more effective to involve children and find solutions together.
Offline time works better when it feels good:
Technical settings can support without controlling. Set up screen time limits or break timers (e.g. in YouTube Kids) together. This makes the transition to offline playful and transparent.
Children closely observe how adults use media. Consciously put your smartphone away and say so: “I’m taking a cell phone break now!” Common rules such as “no cell phones at mealtimes” apply to everyone and are the most convincing.
Irritability or frustration when switching off is normal. Take feelings seriously: “What are you missing right now?” or “What is difficult without a screen?”. If conflicts persist or hobbies fade, take a closer look and get support (e.g. counseling).
Battlefield 6 is a first-person shooter in which players fight from a first-person perspective in large-scale online battles. Military weapons, vehicles and realistic war scenarios take center stage. The game is action-packed and violent – and is therefore aimed at older teenagers.
The game is the latest installment in the well-known Battlefield series and continues it with the latest technology and familiar gameplay. Like Call of Duty or Counter-Strike, it is one of the great first-person shooters. In the single-player campaign, players experience a war story in the year 2027 and fight as an elite unit against a private military organization.
In multiplayer mode, the focus is on large online team battles where tactical teamwork is important. Battlefield Portal also sees the return of a creative community tool that allows players to design and share their own game modes and experiences.
Battlefield 6 is aimed at players who like tactical shooters and team play. The game offers large maps, lots of action, vehicles and explosions – this is exciting and creates suspense. The mixture of combat, teamwork and planning is what makes the game so appealing and keeps many young people playing for a long time.
At the same time, the game is a social meeting place: young people play online with friends or meet new people. Playing as a team and pursuing common goals conveys a sense of community. Leaderboards, progress and rewards provide additional motivation: young people can constantly improve, master challenges and show their progress.
Battlefield 6 is played from a first-person perspective: players see the game world directly through the eyes of their character, including the weapon. War and violence are clearly the focus, which is why the game has received a USK 18 rating for its drastic depictions of violence. In single-player mode, violence is visibly staged as part of the war story; in multiplayer mode, it is primarily part of the game mechanics and tactics (e.g. battles, explosions, vehicle combat). Here, violence serves less as narrative and more as competition between teams. The game offers no ethical evaluation of war or violence.
In-game purchases also play a major role in Battlefield 6. Advertisements for additional purchases or cosmetic content are often displayed to players, which can create pressure to spend real money.
Especially in multiplayer modes, in-game communication is essential in order to chat or talk to other players. This promotes successful cooperation – but can sometimes lead to communication risks such as cyberbullying, hate speech and cybergrooming.
The provider Electronic Arts (EA) sets out the following rules, among others, in its terms of use:
The detailed terms of use, privacy policy and terms and conditions for Battlefield 6 are available on the EA website.
As a parent, pay attention to the USK age rating from 18 years and talk to your child about why this rating is important. The game shows theaters of war and battles very realistically with visual effects such as blood, wounds, explosions and battles. The violence is explicitly visible, not abstracted or highly stylized – and this is a key reason for this age rating. Furthermore, war is at the heart of the entire game. Players control soldiers, use real weapons and find themselves in conflict situations in which characters are killed. Due to the high immersiveness (i.e. the strong feeling of being in the middle of the action), players are very intensively involved, which requires a certain level of maturity. Activate the parental control settings on your child’s devices. This will prevent your child from downloading the game unsupervised.
If your child is playing the game because they are already of age or you consider them to be sufficiently mature, accompany them:
Which learning platform or app is right for my child? And do they even need one at preschool age? Many parents face precisely these questions. Websites and apps with learning content can help children discover, practice and review. However, they are no substitute for learning together, playing or exercise. The decisive factor is how and for what they are used. We present five offers popular in Germany and explain what parents should look out for.
Most of the services presented can be used both in the browser and as an app. The range of functions differs in some cases.
Sofatutor offers learning content from pre-school age to upper school. For younger children, there is Sofatutor Kids with learning games, short videos and exercises on numbers, colors, letters, first arithmetic problems and factual topics. The content is clearly structured and based on educational plans. Parents can create child profiles and view learning progress.
ANTON is one of the most popular learning apps for preschool and school and is often recommended or used by schools. In addition to the widely used app, learning can also take place online in the browser. Children practice math, German, general knowledge or music in short, manageable units. ANTON is ad-free and designed without time pressure.
Antolin is a digital reading promotion program that is mainly used in schools. Children read books offline and then answer questions about the content to collect points. The focus is clearly on reading motivation.
Duolingo teaches foreign languages in a fun way with short exercises, repetitions and rewards. Even children can learn their first words and simple sentences. Both the website and the app are colorful and motivating, but rely on regular use.
Scoyo is aimed at children from around 4 to 12 years of age. The learning platform offers exercises and learning games on German, math and specialist topics, sorted by age and grade level. Parents can create profiles and track learning progress.
Not every learning app or platform is suitable for every child. Age recommendations can be a guide, but say little about whether an offer really suits your own child. The decisive factors are interests, stage of development and individual learning speed. Some children love structured tasks, others learn better through free experimentation, movement or conversation. If an activity causes frustration or your child quickly loses interest, it is not (yet) the right choice. And that’s perfectly fine.
At pre-school age, the focus is not on practicing, but on playful discovery. Children gain their first experiences with numbers, letters or language and learn primarily through curiosity and repetition. Digital learning opportunities can provide stimuli here, but they should be entertaining and not create too much pressure. Supervision is important: talk to your child about what they are seeing and trying out.
When children start school, their needs change. Content now needs to be repeated and consolidated more frequently. Learning platforms and apps can help with this, for example with arithmetic, reading or learning vocabulary. They are well suited as a supplement, but not as a substitute for explanations, homework or joint discussions.
Regardless of age, learning platforms or apps are no substitute for reading aloud, free play or exercise. Make sure there is variety and agree clear times and breaks. Don’t ask your child about points or levels, but about what they have understood or newly discovered. In this way, learning remains positive and digital learning opportunities become what they can be: meaningful support in everyday family life.
Instagram, WhatsApp, YouTube or TikTok – the internet is not a legal vacuum. Anyone who uses social networks or messengers should know the basic rules. This applies to adults as well as children and young people. As a parent, you can help your child to use photos, videos, texts and personal data responsibly. And also set a good example yourself.
From the very first steps online, it is important to introduce children to handling personal data carefully. Vivid comparisons help: Would your child want personal details or secrets shared around the class? The same applies online.
Advise your child to check whether the information is really necessary before sharing it. This includes name, telephone number, address, date of birth, photos, videos, messenger IDs, location data or passwords. Personal data of others may also only be shared with their consent.
Also take a look at app permissions together. Not every app needs access to location, contacts, microphone or camera. Check the settings when you first start the device and regularly after updates.
Parents should also reflect on their own online behavior. Sharing children’s photos or information (“sharenting”) can have long-term consequences, for example through AI-generated deepfakes. Children have a right to privacy, even from their parents.
On platforms such as YouTube, TikTok or Instagram, there is a lot of content that has been uploaded but not created by the user. This includes music, films, series, texts, images, graphics and computer games. These works are protected by copyright and may only be shared publicly with the consent of the copyright holder.
It becomes problematic, for example, if a dance video with a protected music title is uploaded or a picture of a well-known sportswoman is used as a profile picture. Screenshots, memes or short video clips can also be relevant under copyright law. Infringements can result in fines of up to 500,000 euros or account suspensions.
There is content with free licenses, for example under Creative Commons (CC) licenses, which may be used and shared depending on the license terms. The author’s name, a link to the license and any changes must be indicated. In addition, photos and videos are now often edited with filters or AI tools. Copyright and personal rights still apply; an image does not automatically become “free to use” just because it has been technically modified.
The right to one’s own image is part of personal rights and applies to all people, including children. Photos or videos may only be published or passed on if the person depicted has given their consent. In the case of underage children, the parents generally decide. However, as they get older, children should be involved in an age-appropriate manner and be able to participate in decision-making. This applies to public posts as well as messenger messages.
This is often underestimated, especially in class or group chats. Screenshots or forwarding without consent are legally problematic, even if they are only shared with friends. As children get older, they should decide for themselves what content to share. Agree clear rules with family and friends and check the privacy settings of the apps. Messengers are not a legal vacuum.
In recent years, platforms have been more heavily regulated, for example by the Digital Services Act (DSA, fully implemented since 2024) or the General Data Protection Regulation (GDPR). Providers must take risks for minors into account: no personalized advertising, age-appropriate algorithms, rapid reporting of harmful content and uniform complaints procedures.
Nevertheless, responsibility remains in everyday family life. Privacy settings and a conscious approach to content cannot be completely outsourced to platforms. Many conflicts do not arise from bad intentions, but from ignorance, peer pressure or insecurity.
Learning new things and doing homework with chatbots, playing music via voice command on smart speakers, and receiving content recommendations tailored to personal tastes—artificial intelligence (AI) is an integral part of our everyday lives. Children and young people in particular use AI tools as a matter of course, but not always consciously. AI technologies are developing rapidly and continuously. It is not easy for parents to keep track of everything: Which applications are particularly popular with young people? What opportunities, challenges, and risks arise from their use?
At the parents’ evening, we will introduce you to the most popular AI applications and look at their significance in the everyday media lives of adolescents. One focus will be on the risks for children and young people when communicating with chatbots, e.g., disinformation, inappropriate or problematic responses, and when interpersonal relationships are replaced.
Join us live, get practical tips on how to introduce your child to the safe and responsible use of (generative) AI, and ask our experts your questions—we will provide answers and are available for discussion!
The virtual parents’ evening is an event held as part of Safer Internet Day 2026 – you can find out more about the campaign day at klicksafe.
Date: February 11, 2026 | Time: 5:00 p.m. to 6:00 p.m.
Procedure: (Media education) input (approx. 40 minutes) followed by an open discussion
Speakers: Lidia de Reese and Nils Rudolf (FSM)
Moderation: FSM e.V.
Platform: The virtual parents’ evening is held via Zoom.
Privacy Notice: Zoom is a service of Zoom Video Communications Inc., which is based in the USA. We use Zoom via the German operator WTG, whose servers are located in Europe. In addition, within the Zoom service we have chosen the configuration with the highest level of data protection and security.
Please also take note of our privacy policy.
Registration:
Children and young people encounter AI in many places today: Chatbots answer questions for homework, voice assistants help in everyday life, creative apps generate images, music or short stories. This can be exciting, inspiring and confusing at the same time. Many parents therefore ask themselves: How do I guide my child so that they use AI curiously, safely and critically without being overwhelmed? Don’t worry: you don’t have to be an AI expert to accompany your child safely.
Artificial intelligence often seems surprisingly clever. It responds quickly, in a friendly tone, and sometimes more convincingly than adults. However, children should understand: AI does not “know” anything. It merely calculates which answers are likely to appear correct. And that is precisely why it can make mistakes, adopt prejudices or invent content.
Many AI applications also save the data entered. Depending on the tool, more or less information can be collected. Children should therefore learn early on to handle personal information with care. It should also be clear that AI is no substitute for personal advice, teachers or parents.
The younger children are, the more guidance they need when dealing with AI. For primary school children in particular, it is important to try things out together, ask questions and scrutinize results.
Can’t find an answer to your question? Ask your personal questions about your child’s media use directly and conveniently using the messenger service via WhatsApp or Threema. You can find more information here.
The smell of cookies, shopping stress, children’s shining eyes: the holidays are approaching and digital devices and games are on the wish lists of many children and young people. What should parents consider before and after giving such a gift? Between Christmas and New Year and during the vacations, there is also time for shared family media experiences. How can these be made safe, age-appropriate and even creative? In this article, we give you an overview of offers from the Elternguide.online partner network.
The Christmas vacations can be long, especially when the weather outside doesn’t really invite you to play. If you have devices such as smartphones, tablets, cameras or a laptop at home – how about you and your child just get started? You can take photos or film together, try out new creative apps and actively organize media time. It’s great fun and your child will also learn something about media skills along the way.
On the website kinder.jff.de there are suggestions for simple media projects that children aged 3 and over can do at home with the support of you as parents. This is helped by child-friendly video instructions in which the implementation of the media projects is shown step by step. How about a photo memory with Christmas tree decorations or an audio story about Christmas traditions?
knipsclub offers a safe environment for young photo fans between the ages of 8 and 12 to try out their skills in a closed and pedagogically supervised photo community and exchange photos with each other. On the website you will find creative photo tips, for example on
You are probably familiar with challenges from social media, e.g. dance challenges on TikTok. Children and young people love taking on challenges. But challenges don’t have to take place only on the internet; you can also play them at home with your family! Why not try out the top photo challenge, the clip challenge or the re-enactment challenge? We have made a few suggestions in our parents’ guide article. You can find more Advent challenges on the website of the JFF project webhelm.de.
Children have many questions and learn early on that their questions will be answered on the Internet. How is Christmas celebrated in other countries? What craft tips and baking recipes are there for Christmas? The children’s search engine fragFINN offers children access to around 3,400 verified websites, including almost 400 children’s sites. Primary school children can gain their first Internet experience here in a protected surfing room and learn how to use search engines and search results. In the fragFINN Advent calendar, children can open a little door every day, behind which are links to other children’s sites with a wide range of information and offers suitable for the winter season. You can find more playful learning pages in this parents’ guide article.
Your child probably also likes watching videos and going to the movies. A TV evening together can be a really nice family experience. Pay attention to the FSK age rating to protect your child from unsuitable content. But be careful: FSK ratings are
Parents must decide individually when their child is ready for their first smartphone, depending on their level of development and experience. After all, a smartphone theoretically opens up the whole world of the Internet to your child, with all its opportunities and risks. klicksafe offers comprehensive information for parents. Use the smartphone readiness checklist to check whether a smartphone is ready to go under the Christmas tree. Has the decision been positive? Then find out about the technical setting options and set up your smartphone to be childproof. You can find all information material from klicksafe in this topic special.
Would you like to prepare your child for the first smartphone under the Christmas tree? Child-friendly information on the first smartphone is available in the children’s magazine Genial Digital from Deutsches Kinderhilfswerk. The fragFINN app gives your child a protected surfing space on their first smartphone and gives them access to quality, positive content.
Are you considering buying a smartwatch as an alternative? Then take a look at this topic from klicksafe. Please note: technical protective measures are no substitute for family discussions and media rules. Stay in contact with your child and accompany them as they take their first steps with a smartwatch or smartphone.
In addition to discussions and media rules, technical youth media protection is an important component of media education. Use the screen time and digital wellbeing settings on smartphones to set time limits for the entire device or for different apps and to filter content. The parental control program JusProg offers a precise filtering option for websites and safe default settings for mobile devices and laptops. Google Family Link and YouTube Kids offer the opportunity to make media experiences safer for your child in the world of the internet giant Google. Social media apps such as Instagram and TikTok also offer safety features and parental guidance options. Streaming with the family can be a fun activity during the vacations. Almost all streaming services have certified offers for the protection of minors. Use your own child profiles and the parental control function with the PIN. Many of the youth protection programs have been approved by the expert commission of the Voluntary Self-Regulation of Multimedia Service Providers (FSM). You can find out more about technical solutions for the protection of minors in the media on the FSM website.
Detailed instructions for all devices can be found on the website medien-kindersicher.de.
Which games should I give my child for Christmas? Are games okay for preschoolers? In the family section of the USK website and via the USK brochures, you will find all the information you need on the USK’s age ratings, the additional information and how to deal with the subject of games in the family.
The USK mark indicates the age at which a game does not cause any developmental impairments. The additional information such as “fantasy violence” or “pressure to act” gives parents a good indication of whether a game is suitable for their own child. Educational assessments of games can be found at the NRW games guide. Descriptions of popular games like Fortnite, Minecraft or Roblox are available on Elternguide.online. The USK lexicon explains the most important terms, devices and genres.
Would you like to make your child happy with a game for Christmas? Find out about the distribution channels for games and technical precautionary measures. Various settings for the protection of minors can be made on consoles as well as in game stores and the games themselves. Play together with your child and ask them interesting questions about their favorite games.
The team at Elternguide.online wishes you and your family a wonderful Christmas season and lots of fun using media safely and creatively!
Arrange a meeting with your best friend via online message, ask about homework in a class chat or chat digitally with friends about the latest soccer transfer rumors. According to the JIM study, WhatsApp is the most used app among young people. But at what age can the messenger be officially used?
The USK (Entertainment Software Self-Regulation Body) has approved WhatsApp for ages 12 and up. The USK determines the age at which online games and apps are considered suitable in Germany. Its age ratings are based on the provisions of the German Youth Protection Act and the Interstate Treaty on the Protection of Minors in the Media. The reason for the 12+ label is that WhatsApp is a messenger app whose content cannot be checked in advance. With its public community feature, WhatsApp also contains functions similar to social media platforms such as Facebook or Instagram. At the same time, however, WhatsApp also has blocking and reporting functions. Among other things, a USK 12 rating is awarded if an app or game has a chat function that includes the usual moderation tools and safeguards. If your child’s age is stored in the Google Play Store, the app can only be installed from the age of 12.
WhatsApp itself specifies a minimum age of 13 years in its General Terms and Conditions. The T&Cs are rules for using an online service. Before using the app, your child must confirm that they are at least 13 years old. Whether this is true is not actively checked. However, if it becomes known that a user is under the age of 13, WhatsApp has the right to delete the account.
According to the GDPR (General Data Protection Regulation), WhatsApp may process data from the age of 16 without parental consent. If your child is younger, WhatsApp needs your consent as a parent or guardian. By allowing your child to use WhatsApp, you also consent to the messenger service processing your child’s data (e.g. cell phone number).
As parents, you can take 13 years as a guide. Children under the age of 12 should not use the service at all. Officially, the following applies:
If your child is between the ages of 13 and 16, they may use WhatsApp, but they need your consent. Set up rules together with your child, explain the risks to your child and explain the app’s reporting and blocking functions to your child. Set up the account together and pay particular attention to security and privacy settings. You can find more tips on setting up WhatsApp safely here at medien-kindersicher.
What words did you use when you were younger – and what did your parents think? Think about it. It is perfectly normal that you sometimes do not understand your child because they use words that you do not know or use.
Our language is shaped by the adult world. Adolescents have a strong need to form their own identity, to become independent and to distinguish themselves from adults. This is also expressed in the so-called youth language with which they create their own world. With their own language, or at least their own terms, they create something of their own and typical of youth. This connects and creates self-confidence. Using the “outdated” slang of the parents would sound old-fashioned and uncool.
Youth language changes with each new generation, in certain youth scenes and even from place to place, with different words and expressions in each. Typically, young people speak more casually than adults. In doing so, they sometimes use unusual or unfamiliar terms. Their language is less “correct” because they speak more spontaneously. Instead, it conveys far more feelings and moods.
Online communication also has a strong influence on how young people speak or write. In messenger chat, for example, emojis, GIFs, stickers and memes are used. The language is significantly abbreviated and often incorrect. Terms from the
Test your knowledge of your child’s language. The following words are particularly popular in 2025. You will find the answers at the end of the post.
So there is no serious reason to worry if you sometimes do not understand your child. Respect their desire to set themselves apart and generally be understanding if your child uses different words than you do. But that doesn’t mean you have to put up with everything, especially if the language is indecent, hurtful or offensive. Address this with your child. Make it clear why they should not speak to others in this way and, if necessary, agree on rules for how to treat each other. Youth language can also become problematic in online communication, for example in trash talk in online games or in cyberbullying. Talk to your child about respectful behavior online: the same rules of fairness, openness and respect that apply offline should also be observed online.
You are and will remain the adult from whom your child wants to set themselves apart. So don’t try to adopt youth language yourself; your child is more likely to perceive this as an invasion of their privacy. Speak the way you always do. Nevertheless, you can occasionally enjoy your child’s imaginative word creations and ask if you don’t understand something!
Answers:
An idea for the next birthday present, help with a math problem or simply a question about the opening times of the swimming pool: many families now use artificial intelligence in their everyday lives. At the same time, many parents are wondering which AI services are reliable, safe and also suitable for children. After all, not all AI is the same; the offerings differ significantly in terms of data protection, transparency and target groups, for example.
It’s clear that AI tools can make everyday life easier. Chatbots and so-called AI agents find and bundle information, explain complicated content, take on small tasks or even act as conversation partners. Many services – from search engines to messengers – now have integrated AI functions. This makes them even easier to access and more convenient to use.
At the same time, this also increases risks. Some services place little value on data protection and privacy, collect extensive usage data or are difficult to understand. Others deliver results that are difficult to assess or are not always correct. The seemingly simplest services, which are already integrated into apps and websites, are not always the best. Unfiltered content can also be problematic for children.
The search for suitable AI offerings is not easy. Many tools appear similar and yet differ greatly. We present three possible alternatives:
Perplexity AI – the AI search engine with transparent sources
Perplexity AI works like a chatbot, but compiles information more like a search engine. One advantage is that the AI displays the sources used for each answer. This makes it easier to assess the results and can be particularly helpful for young people doing research for school or projects. Perplexity AI can be used free of charge in the browser or with a paid Pro account, which saves searches, uses better AI models and enables more uploads. The provider states that it uses data for further development, but does not sell it. The service is not specifically aimed at children, but can be a more transparent tool for searches if used with supervision.
Duck AI – the AI with a focus on data protection and privacy
DuckDuckGo is known as a privacy-friendly search engine and browser. The browser does not collect any data, does not record IP addresses and blocks advertising. With Duck AI, the service now also offers its own AI function, again with a strong focus on data protection: queries are sent to the models anonymously, chats are not saved and data is not passed on for training purposes. The AI can be used free of charge, even without the DuckDuckGo browser. This can be a reliable alternative for parents for whom the protection of personal data is particularly important. Again, this is not a dedicated offering for children, but an option with clearly communicated data protection.
KinderGPT – the AI especially for children
KinderGPT is a German service that is specifically aimed at families. Content is filtered and prepared in an age-appropriate way. Parents can limit usage times or approve subject areas. According to the provider HillcrownAI, no personal data is passed on. The basic version is free, additional functions can be activated for a fee. KinderGPT does not replace supervision, but offers a protected environment in which children can gain their first experiences with AI.
Which AI service is suitable depends largely on what it is to be used for. Together with your child, think about where your priorities lie: quick and easy use, high security or verifiable results? If you are aware of what you expect from an AI, you can also look for ‘the right one’.
Stay up to date on what AI can do and where the pitfalls lie. AI services are developing rapidly and new offerings are constantly being added. It is therefore important to stay informed about new possibilities and limitations.
Regardless of the service you choose: Discuss safe behavior when using AI with your child. Explain what data can be given to an AI – and what should remain private. Find out together how AI works and how you can question and check results to be sure. Also be aware of your role as a role model.
And finally: Get to know contact points together where you can find support if problems arise.
Of course, you know your child’s friends from school or the sports club. But what about acquaintances with whom your child has contact only via the Internet? Whether online gaming, in video chats or via social media – wherever children and young people communicate with each other, they can come into contact with people who have negative intentions. According to the JIM Study 2024, almost a third of the 12-19-year-olds surveyed had experienced sexual harassment online.
Platforms such as Instagram, TikTok or Discord, which children and young people like to use, also attract users with paedocriminal tendencies. This is criminal behavior directed against minors. Adults approach children with the aim of sexually harassing or even abusing them. This targeted approach online is called cybergrooming.
The strategies are varied – but they are always aimed at winning the trust of children and young people and gaining control over the communication.
A clear warning signal is when the stranger wants to move the chat to a private messenger such as WhatsApp or Telegram – because nobody else can read along there. The perpetrator usually insists that this chat remain secret at all costs. In such private conversations, trust is built up that can later be exploited. This is often followed by a request for intimate photos or videos, which can then be used to put the child under pressure or blackmail them.
You can find out more about this problematic phenomenon in this video from the Kinderschutzbund:
You can find more tips on how to protect your child from sexual violence on social networks here.
Children and young people are naturally curious. They sometimes forget all warnings and can fall into a trap. Make it absolutely clear to your child that even in such cases, the perpetrator is solely to blame. Cybergrooming – even the attempt – is punishable in Germany. If your child is sexually harassed, be there for them and do not assign blame. Report the perpetrators on the respective platform or to reporting centers. Secure evidence by taking screenshots of the chat history and profile, and be sure to report the incident to the police! But be careful with depictions of abuse of children and young people: possessing them is a criminal offense. The Internet Complaints Office has summarized information on dealing with depictions of abuse online in this PDF. You can find more information on this topic in this article.
Children and young people can find help, advice and information here:
The following contact points are available for parents:
You can find more digital advice for children, young people and parents here.
Having their own tablet can be exciting for children: playing games, reading books, getting creative or using educational apps. However, many parents ask themselves: when is it worth giving them their own device – and when is the family tablet enough?
A shared tablet makes sense in the early years. Your child can try out content while you accompany, explain and restrict what they see and do. One family device is often enough to gain initial experience with apps, videos or games and to reflect on media use together.
It makes sense for your child to have their own tablet if they want to use media independently on a regular basis, pursue their own interests or use learning apps that require personal accounts. For children from around 6 to 7 years of age, having their own device can be useful if you clearly regulate and supervise their use.
It’s not age that matters, but your child’s maturity: Can they follow rules? Do they know how to surf the net safely? Can they reflect on content and distinguish between games, learning and entertainment? Only when these skills are in place is it worth buying a separate device.
A tablet for children should be robust, intuitive to use and not too expensive. Simple devices with a sturdy casing and a child-friendly interface are good entry-level options. Check whether educational apps, audio books or creative tools are useful and whether in-app purchases and advertising can be deactivated.
Whether it’s a family device or your own, set usage times, content, and rules together. Parental Controls, family accounts, and youth protection apps help to control media use. Discuss together: How long should your child be allowed to use the tablet? What content is permitted?
Having your own tablet can promote digital independence if you provide guidance: explain functions, try out learning apps together, and talk critically about advertising, algorithms and content. This will teach your child to use media consciously and reflectively.
Can’t find an answer to your question? Ask your personal questions about your child’s media use directly and conveniently using the messenger service via WhatsApp or Threema. You can find more information here.
“Just ask a chatbot!” – This is a tip that parents now often hear when it comes to quickly searching for information or support with tasks. ChatGPT has long since become part of everyday life not only for adults, but also for children and young people. We provide an insight into what the tool can do, how it works and what you should look out for.
ChatGPT is an AI-based chatbot that analyzes texts and writes them itself. It can answer questions, provide ideas or summarize longer texts in natural language and often with astonishing precision. The AI has been trained with huge amounts of data from books, websites and conversations and mimics human communication. ChatGPT can also be integrated into other services, such as WhatsApp or Instagram.
The free version uses the GPT-4o-mini model with training data up to around October 2023 and can also access the internet to a limited extent. The Plus version has comprehensive internet access and offers additional functions.
Question: I am planning my son’s birthday. Can you help?
ChatGPT: Sure! How old will your son be and what does he particularly like? Then we can collect ideas for games, decorations or food.
Question: He’ll be nine and loves dinosaurs.
ChatGPT: How about a dinosaur treasure hunt in the garden? You can hide little figures and draw a map. Or bake dino cookies – would you like me to find you a recipe?
With “Family Pairing”, OpenAI offers the option of linking parent and child accounts in order to give underage users more protection.
ChatGPT is easily accessible, fast and versatile. It can write creative texts, answer complex questions and communicate in several languages. For many children, it feels like they are talking to a real person. The tool encourages creativity, language comprehension and structured thinking, while supporting digital independence.
Children and young people use ChatGPT for example for:
OpenAI emphasizes:
Try it out together: Test ChatGPT together. Then discuss which answers are useful or problematic. This will help your child learn how to deal critically with AI.
Use the family pairing function: Check the settings and activate protection mechanisms if necessary.
Encourage critical thinking: Explain that ChatGPT does not provide “truth” but recognizes patterns. Answers can be wrong or contain prejudices.
Keep an eye on data protection: Agree not to disclose any personal data, i.e. no names, addresses, telephone numbers or photos. If your child uses ChatGPT via WhatsApp or other apps, point out that additional data is shared there.
For school: ChatGPT can support learning, for example by explaining difficult terms simply or summarizing texts. However, it should not do the homework itself. Make it clear to your child: AI is a tool that can support learning, but cannot replace it.
Keep the conversation going: Encourage your child to speak up if they feel uncomfortable or receive inappropriate content. AI can be exciting, but should always be used consciously and critically.
Children or young people may come across content online that depicts violence – when scrolling through social media, on video platforms or when such videos are shared in messenger group chats. This may involve fights, abuse, accidents or drastic images from war zones. Such content can frighten and disturb children and young people. It also violates the right to one’s own image of the person depicted and may even be punishable by law.
Under no circumstances should such content be redistributed – this prevents even more people from seeing it. If the content is illegal, publishing and redistributing it can even have criminal consequences.
Encourage young people to go one step further: not to ignore such content, but to actively report it. Look together to see where the report function can be found on the platforms used, such as Instagram, Snapchat, TikTok or YouTube. According to the terms of use, violent and cruel content is not permitted and should be deleted. Messages or people can also be reported on WhatsApp.
The online advice platform for young people Juuuport even offers a separate reporting option for young people, e.g. for violent videos, extremism or hate speech. The internet complaints offices FSM, eco and jugendschutz.net then take care of these complaints.
Violence also plays a role in fictional media content, such as films, series or games. Content that is easy for older children, teenagers and adults to process can frighten young children. This is why there are age restrictions for depictions of violence in the media, which are regulated by the protection of minors in the media. The age labels of films or computer games show you as parents from what age the content shown is suitable.
However, children and young people can also come across real depictions of violence online. An experience report from Juuuport clearly shows how differently young people deal with violent videos online and can make it easier for you to start the conversation.
Talk to your child about possible negative experiences online. Show them the reporting and blocking functions on the relevant platforms and discuss how algorithms select content. Also explain how this selection can be influenced or reset, for example by marking content as “not interested”, to keep your child’s online experience safer and healthier. Talk about what it means when photos or videos carry a “sensitive content” warning and what your child should do in that case. Also find out about the legal regulations for the protection of minors in the media and about technical protection options via apps and settings on your child’s devices or in individual services.
WhatsApp is the most popular messenger, even among children and young people. Almost everyone uses it to send messages, make calls or share status updates. It is important for parents to know the functions, risks and developments, especially since AI functions have been integrated into the app.
WhatsApp is a free messenger that is registered via the cell phone number. Contacts must be saved in the smartphone’s address book so that messages, photos and videos can be exchanged with them.
Self-deleting messages that disappear after seven days and photos or videos that can only be viewed once are particularly practical. Chats can be locked with a code or fingerprint. Group administrators can delete messages from others, making it easier to moderate content.
The search function helps you to quickly find specific messages, links or media across all chats. This makes Messenger easier to use, but can also tempt you to bring up old conversations or private content that was actually forgotten.
Since the integration of Meta AI, an AI has been supporting users in formulating messages, summarizing long chats and suggesting emojis or answers. It can also answer questions, similar to a chatbot. The use of these functions is optional; users decide for themselves whether they want to use them. There are also channels on which influencers, media or organizations broadcast content to their followers.
WhatsApp is quick, easy and always available. Young people use it to stay in touch, discuss homework or simply to belong. They share impressions of their everyday lives via status messages and profile pictures, similar to Instagram.
The new channels make WhatsApp even more attractive. Young people follow influencers there and receive trends, challenges and tips directly in the app. Having your own channels encourages creativity and organization, but can also increase the desire for reach or recognition.
The AI functions are also fascinating: An automatic writing assistant that suggests texts saves time and seems practical. However, young people should understand that this AI is not a neutral conversation partner, but learns from their input.
The read receipt (“blue check marks”) can create pressure to reply immediately. Large groups, such as class chats, can also lead to a flood of messages or conflicts.
Another risk is data processing by Meta. WhatsApp collects a lot of information: Contacts, profile and location data, device and usage information. This can be used for personalized advertising or to train the AI.
The integration of Meta AI brings additional challenges. Many users do not know what data the AI processes or stores. Emotional bonds can also develop with chatbots, especially when children talk about personal topics. Such conversations often seem human, but they are not.
Influencer channels can show content that is unsuitable for children. Advertising, idealization and one-sided portrayals are common, often without clear labeling.
WhatsApp belongs to the Meta Group. The provider emphasizes that chats are end-to-end encrypted, so that not even WhatsApp itself can read message content.
According to the provider, Meta AI should make it easier to use and deliver personalized results. At the same time, this means that the AI learns from the user’s data. Parents can check whether the AI functions are active in the settings. On some devices, this is only visible to users under 18. Use can be restricted or rejected if the device and app allow it.
Children under the age of 16 may only use WhatsApp with parental consent. Talk together about responsible use: What information can be shared? Who is allowed to see messages?
Discuss risks such as bullying, inappropriate or dangerous content (e.g. pornographic material), “fake news” or emotional attachments to AI chatbots. Encourage your child to get help if they have unpleasant experiences.
Set the data protection settings together. Pay attention to your child’s privacy and give them the freedom they need. The privacy check (Settings > Privacy > Privacy check) shows at a glance which settings are active and who is allowed to see what. You can find step-by-step instructions at www.medien-kindersicher.de.
Your child should only allow contacts that they know in real life. Discuss the responsible use of your own channels and influencer content. Be a role model for respectful communication yourself.
Explain that AI answers are not always correct and do not replace real conversations. Keep the conversation going and show interest in your child’s digital world – without mistrust, but with awareness.
If you are looking for messengers with stronger data protection, you can switch to alternatives such as Signal, Threema or NYZZU.