
Parent check-in: When should my child get their own tablet?

Having their own tablet can be exciting for children: playing games, reading books, getting creative or using educational apps. However, many parents ask themselves: when is it worth giving them their own device – and when is the family tablet enough?

Family device or your own tablet?

A shared tablet makes sense in the early years. Your child can try out content while you accompany, explain and restrict what they see and do. One family device is often enough to gain initial experience with apps, videos or games and to reflect on media use together.

It makes sense for your child to have their own tablet if they want to use media independently on a regular basis, pursue their own interests or use learning apps that require personal accounts. For children from around 6 to 7 years of age, having their own device can be useful if you clearly regulate and supervise their use.

Maturity and independence are decisive

The decisive factor is not age but your child's maturity: Can they follow rules? Do they know how to surf the net safely? Can they reflect on content and differentiate between games, learning and entertainment? Only when these skills are in place is a device of their own worthwhile.

The right device

A tablet for children should be robust, intuitive to use and not too expensive. Simple devices with a sturdy casing and a child-friendly interface are good entry-level options. Check whether educational apps, audio books or creative tools are useful and whether in-app purchases and advertising can be deactivated.

Safe and accompanied use

Whether it’s a family or personal device: define usage times, content and rules together. Parental controls, family accounts and parental control apps help to control media use. Reflect together: How long is your child allowed to use the tablet? What content is allowed?

Promoting media literacy

Having your own tablet can promote digital independence if you accompany them: Explain functions, try out educational apps together and talk critically about advertising, algorithms and content. In this way, your child learns to use media consciously and reflectively.

Can’t find an answer to your question? Ask your personal questions about your child’s media use directly and conveniently using the messenger service via WhatsApp or Threema. You can find more information here.

ChatGPT

“Just ask a chatbot!” – This is a tip that parents now often hear when it comes to quickly searching for information or support with tasks. ChatGPT has long since become part of everyday life not only for adults, but also for children and young people. We provide an insight into what the tool can do, how it works and what you should look out for.

In a nutshell:

  • AI-based chatbot that has been trained with huge amounts of text
  • Functions: Answers questions, writes texts, helps with creative tasks and can be integrated into messenger services such as WhatsApp
  • Developer: OpenAI
  • Use: In the browser or as an app (Android, iOS); free basic version; paid Plus version (approx. €20/month) provides faster responses, Internet access and multimodal capabilities (images, videos)
  • Age rating: According to the provider from 13 years; between 13 and 18 years only with parental consent; no age check

What is ChatGPT?

ChatGPT is an AI-based chatbot that analyzes texts and writes them itself. It can answer questions, provide ideas or summarize longer texts in natural language and often with astonishing precision. The AI has been trained with huge amounts of data from books, websites and conversations and mimics human communication. ChatGPT can also be integrated into other services, such as WhatsApp or Instagram.

The free version uses the GPT-4o-mini model with training data up to around October 2023 and can also access the internet to a limited extent. The Plus version has comprehensive internet access and offers additional functions.

A sample conversation

Question: I am planning my son’s birthday. Can you help?
ChatGPT: Sure! How old will your son be and what does he particularly like? Then we can collect ideas for games, decorations or food.
Question: He’ll be nine and loves dinosaurs.
ChatGPT: How about a dinosaur treasure hunt in the garden? You can hide little figures and draw a map. Or bake dino cookies – would you like me to find you a recipe?

With “Family Pairing”, OpenAI offers the option of linking parent and child accounts in order to offer underage users more protection. The aim is to show age-appropriate content and help children use it safely without monitoring them. Parents can set rules together with their children and reflect on how AI is used.

What fascinates children and young people about it?

ChatGPT is easily accessible, fast and versatile. It can write creative texts, answer complex questions and communicate in several languages. For many children, it feels like they are talking to a real person. The tool encourages creativity, language comprehension and structured thinking, while supporting digital independence.

Children and young people use ChatGPT for example for:

  • Help with homework, presentations or organization
  • Creative projects (e.g. writing poems or stories, developing ideas for games)
  • Foreign language exercises or explaining difficult terms
  • Virtual entertainment and communication

What can be problematic?

  • Incorrect information: ChatGPT can make mistakes or “invent” answers (so-called hallucinations).
  • Data protection: Personal data provided in the chat can be stored and analyzed. This is particularly true when using messenger services such as WhatsApp or Instagram.
  • Inappropriate content: Despite filtering, inappropriate or problematic answers may appear.
  • Apparent familiarity: The empathic communication can promote emotional bonds, even though ChatGPT is not human.
  • Dependency: Children may rely too heavily on ChatGPT instead of doing their own research or thinking critically.
  • Limited Internet access: The free version can currently access the Internet, but only to a limited extent and with a limited number of requests. Comprehensive, fast Internet access requires the Plus subscription.

What does the provider say?

OpenAI emphasizes:

  • Filters and moderation: Inappropriate or dangerous content is automatically blocked.
  • Transparency: ChatGPT occasionally indicates that information should be verified.
  • Protection of minors: Use under the age of 13 prohibited, from 13 only permitted with parental consent. However, there is no real age check.

What parents should pay attention to

Try it out together: Test ChatGPT together. Then discuss which answers are useful or problematic. This will help your child learn how to deal critically with AI.

Use the family pairing function: Check the settings and activate protection mechanisms if necessary.

Encourage critical thinking: Explain that ChatGPT does not provide “truth” but recognizes patterns. Answers can be wrong or contain prejudices.

Keep an eye on data protection: Agree together that no personal data is disclosed, i.e. no names, addresses, telephone numbers or photos. If your child uses ChatGPT via WhatsApp or other apps, discuss the fact that additional data is shared there.

For school: ChatGPT can support learning, for example by explaining difficult terms simply or summarizing texts. However, it should not simply do the homework. Make it clear to your child: AI is a tool that can support learning, but cannot replace it.

Keep the conversation going: Encourage your child to speak up if they feel uncomfortable or receive inappropriate content. AI can be exciting, but should always be used consciously and critically.

Parent check-in: What should I do if my child sees violent videos online?

It can happen that children or young people come across content online that depicts violence: when scrolling through social media, on video platforms or when such videos are shared in messenger group chats. This may involve fights, abuse, accidents or drastic images from war zones. Such content can frighten and disturb children and young people. It can also violate the depicted person's right to their own image and may even be punishable by law.

What to do: Do not continue to share, but report

Under no circumstances should such content be passed on. Not sharing it prevents even more people from seeing it. If the content is illegal, publishing and redistributing it can even have criminal consequences.

You can also encourage young people to go one step further: not ignoring such content, but actively reporting it. Look together to see where the report function can be found on the platforms used, such as Instagram, Snapchat, TikTok or YouTube. According to the terms of use, violent and cruel content is not permitted and should be deleted. Messages or people can also be reported on WhatsApp.

Juuuport, an online advice platform for young people, even offers a dedicated reporting option, e.g. for violent videos, extremism or hate speech. The internet complaints offices FSM, eco and jugendschutz.net then take care of these complaints.

Education and protection

Violence also plays a role in fictional media content, such as films, series or games. Content that older children, teenagers and adults can process easily may frighten young children. This is why there are age restrictions for depictions of violence in the media, regulated by youth media protection law. The age labels of films and computer games show you as parents from what age the content is suitable.

However, children and young people can also come across real depictions of violence online. An experience report from Juuuport clearly shows how differently young people deal with violent videos online and can make it easier for you to start the conversation.

Talk to your child about possible negative experiences online. Show them the reporting and blocking functions on the relevant platforms and discuss how algorithms select content. Also explain how feeds can be curated or reset to keep your child's online experience safer and healthier. Talk about what it means when photos or videos carry a ‘sensitive content’ warning and what your child should do in that case. Also find out about the legal regulations on the protection of minors in the media and about technical protection options via apps and settings on your child's devices or in individual services.


WhatsApp – the number 1 messenger app

WhatsApp is the most popular messenger, even among children and young people. Almost everyone uses it to send messages, make calls or share status updates. It is important for parents to know the functions, risks and developments, especially since AI functions have been integrated into the app.

In a nutshell:

  • Free messenger app for Android, iOS and web
  • Functions: Chats, voice messages, calls, video telephony, files, contacts, location sharing, group chats, central search function
  • Age rating: USK from 12 years (Google Play Store)
  • Notes on use: content for different age groups; chats allowed from 13 years according to the provider
  • Additional features: AI integration (Meta AI), influencer channels, own channels, self-deleting messages, chat locks

What is WhatsApp?

WhatsApp is a free messenger that is registered via the cell phone number. It accesses the contacts in the smartphone's address book so that messages, photos, videos, voice messages, files or the current location can be sent. Group calls and video calls are also possible.

Self-deleting messages that disappear after seven days and photos or videos that can only be viewed once are particularly practical. Chats can be locked with a code or fingerprint. Group administrators can delete messages from others, making it easier to moderate content.

The search function helps you to quickly find specific messages, links or media across all chats. This makes the messenger easier to use, but can also tempt users to dig up old conversations or private content that had actually been forgotten.

Since the integration of Meta AI, an AI has been supporting users in formulating messages, summarizing long chats and suggesting emojis or answers. It can also answer questions, similar to a chatbot. The use of these functions is optional; users decide for themselves whether they want to use them. Channels on which influencers, celebrities, brands or journalistic media such as Tagesschau post content can be found under “News”. Children and young people can also create their own channels, for example for school projects or groups of friends.

What fascinates young people about it?

WhatsApp is quick, easy and always available. Young people use it to stay in touch, discuss homework or simply to belong. They share impressions of their everyday lives via status messages and profile pictures, similar to Instagram.

The new channels make WhatsApp even more attractive. Young people follow influencers there and receive trends, challenges and tips directly in the app. Having your own channels encourages creativity and organization, but can also increase the desire for reach or recognition.

The AI functions are also fascinating: An automatic writing assistant that suggests texts saves time and seems practical. However, young people should understand that this AI is not a neutral conversation partner, but learns from their input.

What can be problematic?

The read confirmation (“blue checkmark”) can create pressure to reply immediately. Large groups, such as class chats, are often confusing and can lead to conflict or stress. Quickly shared photos, videos or voice messages can be easily forwarded, which can encourage bullying or embarrassing situations.

Another risk is data processing by Meta. WhatsApp collects a lot of information: Contacts, profile and location data, device and usage information. This can be used for personalized advertising or to train the AI.

The integration of Meta AI brings additional challenges. Many users do not know what data the AI processes or stores. Emotional bonds can also develop with chatbots, especially when children talk about personal topics. Such conversations often seem human, but they are not.

Influencer channels can show content that is unsuitable for children. Advertising, idealization and one-sided portrayals are common, often without clear labeling.

What does the provider think?

WhatsApp belongs to the Meta Group. The provider emphasizes that chats are end-to-end encrypted. This protects messages from being read, but only as long as no cloud backups are activated.

According to the provider, Meta AI should make it easier to use and deliver personalized results. At the same time, this means that the AI learns from the user’s data. Parents can check whether the AI functions are active in the settings. On some devices, this is only visible to users under 18. Use can be restricted or rejected if the device and app allow it.

What parents should pay attention to

Children under the age of 16 may only use WhatsApp with parental consent. Talk together about responsible use: What information can be shared? Who is allowed to see messages?

Discuss risks such as bullying, inappropriate or dangerous content (e.g. pornographic material), “fake news” or emotional attachments to AI chatbots. Encourage your child to get help if they have unpleasant experiences.

Set the data protection settings together. Pay attention to your child’s privacy and give them the freedom they need. The privacy check (Settings > Privacy > Privacy check) shows at a glance which settings are active and who is allowed to see what. You can find step-by-step instructions at www.medien-kindersicher.de.

Your child should only allow contacts that they know in real life. Discuss the responsible use of your own channels and influencer content. Be a role model for respectful communication yourself.

Explain that AI answers are not always correct and do not replace real conversations. Keep the conversation going and show interest in your child’s digital world – without mistrust, but with awareness.

If you are looking for messengers with stronger data protection, Signal, Threema or NYZZU are alternatives.

Parent check-in: Why is Roblox now 16+?

Roblox is very popular among children and young people. Players can create virtual worlds in a Lego-like style; the platform is a mixture of game world and social network. At the beginning of 2025, Roblox received a new age rating of 16+ – we take a look at what this means for young people and for you as parents.

What the youth protection authorities say

Roblox combines gaming, social media and creative design on one platform. For example, chat functions are also included, players can develop their own games and even earn money with them. In January 2025, the Entertainment Software Self-Regulation Body (USK) raised the age rating for Roblox from 12 to 16. The reasons for the increase are violent content, increased purchase incentives and a range of offers for different age groups. The USK also points out online risks, e.g. chats and in-game purchases with random content, so-called loot boxes. With these gambling-like mechanisms, it is particularly difficult for younger people to keep track of their spending. If accounts are not adequately secured, children and young people can come into contact with problematic behavior such as cybergrooming or cyberbullying.

In addition, the security measures intended to protect younger players are not sufficient: there are no age labels in accordance with German youth protection standards and children’s accounts are not secure enough. The new age rating “from 16 years” should also provide parents with better guidance.

What now? Tips for parents and families

What does this mean for you and other families? First of all, if your child (under the age of 16) already has a Roblox account, you are not obliged to delete the account due to the age rating upgrade. However, the reasons for raising the age rating clearly show that the platform harbors risks for children that need to be taken seriously. Based on this, you as parents must ultimately decide whether the platform is still suitable for your child.

You don’t want to ban use completely? Make sure you check the account restrictions together and adjust them if necessary. In any case, create a parent account and link it to your child’s account. Talk to your child about the age upgrade and why the approval has been adjusted. Think about how future use can be made safer – for example, only when accompanied by you or older siblings.
If you decide that your child should delete an existing Roblox account, talk about it together and explain your reasons. Show understanding for the fact that this decision may make your child sad or angry. Think together about what alternative, age-appropriate games your child might enjoy. You can find recommendations, for example, at the NRW games guide.

If you are already using a parental control program, depending on the age setting, games with a 16+ rating may be automatically blocked. You can read more about games here.


When parental control settings are circumvented

Does this sound familiar? You’ve done a lot of reading, had lengthy discussions and installed elaborate parental control programs on your children’s devices – only to find that YouTube is still on all night long. The tricks for circumventing Family Link and the like sometimes spread faster than head lice, in the school playground and online, and leave us parents rather perplexed.

Surfing despite the parental control app – how does it work?

Family Link or Apple parental controls, JusProg or Kidgonet – when children start to use media independently, parents often worry a lot about usage times and safety. The solution is often technical restrictions such as parental control apps that set time limits or filter content. However, after a while it often turns out that although the apps and settings allow you as parents to sleep peacefully, they are hardly an obstacle: your child surfs as they please. They reinstall apps, open a browser that the filter does not recognize, click on links and detours to YouTube or simply change the time or time zone on the device. Some children even install VPN services, use camouflage apps or create guest accounts on their devices to evade the unpopular parental control settings.

This raises many questions for parents. How good are child protection programs really? How can rules be enforced? And above all: how can children be well protected when using media if the apps can be bypassed?

Why is it so easy for children to bypass the apps?

For children, bypassing the parental control settings is of course a challenge and almost a sport. Anyone who has been annoyed a few times that screen time ended at the worst possible moment may start looking for ideas to trick the system. And children and young people are quick to find them. The internet, especially YouTube, is full of ideas and instructions on how to circumvent limits, locks and settings.

Many of these “detours” can be prevented by you as parents by making the settings of the parental control programs more rigorous or by allowing less creative freedom on the child’s device. For example, you can assign admin rights so that your child cannot download or install anything without your consent. You can assign parental PINs to many devices and accounts so that only you can change the settings. There are now also tested and very secure parental control settings for games consoles, which you can use to set the usage rights and times so that there is little scope for ‘detours’. For example, the USK has tested the parental control programs for the Xbox and Switch and found them to be suitable and safe.

Not only games, apps and devices can be secured, you can also set up your router to be childproof – and much more. You can find step-by-step instructions at medien-kindersicher.de.

How can parents deal with the conflict?

But of course, youth protection should not be a race for technical possibilities. It is better to take a two-pronged approach to media education. Youth media protection solutions are still a useful tool for protecting children from difficult or dangerous content. However, families should never rely solely on technical filters. And not just because no filter can guarantee one hundred percent protection. Children grow into media worlds and they not only need to be protected from excessive demands at a young age, but above all they need to learn how to use them well and competently.

It is therefore much more important to accompany children in their media use than to install restrictions:

  • Discuss with your child where dangers lurk, why too much media use, content that is not age-appropriate or intensive use of social media can be problematic.
  • Define the rules and settings for the apps together so that your child understands them and supports them. A media usage agreement can help with this.
  • If your child bypasses a barrier, ask what their goal was.
  • Explain why the protection settings are important for your child.
  • Negotiate with your child and adjust the media rules and settings together if necessary.
  • Make clear what consequences bypassing the parental control apps has for their media rules.
  • Keep in touch about the child’s wishes and needs – and about your concerns.

If your child uses media in a reflective, competent manner and with trustworthy parents at their side, the question of whether app restrictions need to be circumvented secretly may even be a thing of the past. You can find more tips for everyday family life with parental control apps in this article.

Parent check-in: How much screen time is okay during the vacations?

Many parents ask themselves this question at the beginning of the vacation weeks. The answer is not so easy to give. There is no one-size-fits-all answer. Appropriate screen time depends on your child’s age, stage of development and needs – so it’s an individual decision. Talking to other parents can help, but no two children and families are the same and media rules can vary accordingly.

How much – but above all what and why!

Set rules for screen and media time depending on what media your child uses and how well they can handle it.

Shared media use and conversations about these experiences will help you to assess this well. The form of use should also play a role: How does your child spend their time? Are they finding out about a topic that is currently of particular interest to them? Are they playing games, watching series or endlessly scrolling through social media feeds? Are they alone?

Media offer us the opportunity to inform ourselves, to be inspired, to interact with others, to distract ourselves from our hectic everyday lives – but they can also stress us out or even burden us. Especially during the vacations, it can be good to take a break from the often packed and tightly structured daily routine of school, sports clubs or music lessons. Using media can help your child relax and stay in touch with friends. However, a balance is important, such as playing indoors and outdoors, sports, arts and crafts or spending time with friends and family.

Finding rules together

Media rules are more likely to be accepted if they are made together with your child. Talk together about what media and content your child uses, when, how and why. Also think about your own media use, as parents act as role models. Establish common media rules for the whole family. Deviations during the vacations or at weekends are okay! Rules must fit your child’s development and needs as well as your everyday family life so that they can be adhered to.

You can find out more about media rules and screen time in these Elternguide.online articles and videos:

https://www.elternguide.online/regeln-fuer-die-mediennutzung-in-der-familie/

https://www.elternguide.online/medienregeln-fuer-schulkinder/

https://www.elternguide.online/der-staendige-streit-um-medienzeiten-ab-wann-ist-es-zu-viel/

https://www.elternguide.online/wann-ist-viel-zu-viel-zwischen-sucht-und-extremer-mediennutzung/

https://youtu.be/bPw7vqI2fxA


Parent check-in: My child is getting their first smartphone – what should I look out for?

Your child’s first smartphone is an important milestone in their life. Parents ask themselves many questions beforehand: When is the right time for the first smartphone? Which device is suitable and what else needs to be considered?

Maturity is crucial

There is no universal rule that recommends a certain age as appropriate for the first smartphone. The move to secondary school is often taken as an opportunity to equip children with their first smartphone. As parents, you know best whether your child is ready for a smartphone. Your child’s maturity, skills and media experience are more important than whether they have reached a certain age. This checklist from klicksafe can help you decide.

Which device should it be? – Robust, simple and not necessarily expensive

It is a good idea to buy a smartphone with a sturdy case and/or protective cover to get started. A device with intuitive, simple operation makes sense. This way, your child will quickly understand the smartphone’s functions and find it easier to use. It doesn’t have to be the latest and most expensive model; a solid, used model is suitable for beginners. It is advisable to keep an eye on price and performance.

Protection and security

Not all smartphone functions and apps are suitable for young users. A device with options for parental controls and family sharing, such as screen time and app restrictions, is therefore recommended. Select age-appropriate apps and adjust the app settings. Regular updates and secure, up-to-date software are also important. You should also be familiar with the device. This way, you can be a reliable contact person for your child and provide reassurance. Test the smartphone together with your child to make sure it is suitable. Accompany your child as they take their first steps and explain the potential risks to them.

You can find out more about your first smartphone here.


Communication risks on the net

Chatting via Messenger, playing games together or taking part in social media trends – media enable us to be in contact with others. Children and young people face many challenges when communicating online. On Elternguide.online, we explain how you and your family can deal safely and competently with communication risks online.

Challenges of digital communication

When we write messages via messenger, we don’t just use letters, we also like to use emojis. However, emojis can be understood differently, so care should be taken to avoid misunderstandings. Chatting, posting and gaming is fun. However, being constantly available can overwhelm children and young people and lead to digital stress and the fear of missing out (FOMO). Be aware of your role model function and, if necessary, make technical adjustments together to regulate media use.

Contact by strangers

Whether through online gaming, video chats or social media – it’s easy to meet new people on the internet. Contact with strangers can be risky because we don’t know the person’s intentions and don’t know who is actually communicating with us. Is it really a gaming friend of the same age? When paedophile criminals write to children or young people to initiate sexual contact, this is known as cybergrooming. If supposedly private images such as nude photos are used to blackmail someone, this is called sextortion.

Communication with friends

Sometimes communication with friends and acquaintances can also become problematic. Among young people, there is a risk of cyberbullying, for example, via chat groups. Sexting, the sending of revealing messages and images, can be problematic in relationships. It is helpful if rules are agreed on how to deal with messenger chats. Discuss this with other parents and your child’s teachers. Talk to your child about being careful with their own data, such as nude images. Explain to them how they should deal with insults and nasty comments and make them aware of reporting points.

Dealing with AI tools

AI applications have long since arrived in the everyday lives of children and young people and automatically accompany them when they use search engines, messengers and social media. They chat with chatbots such as MyAI on Snapchat, enter into intimate relationships with AI contacts or use programs such as ChatGPT or Meta AI to collect ideas or find solutions. In doing so, they encounter challenges such as misinformation, problematic content and data misuse, as well as the difficulty of distinguishing between human and machine communication. Talk to your child about the opportunities and risks of AI tools and adjust the safety settings in the apps together. Promote your child’s critical thinking and encourage them to question answers from chatbots, check information and understand AI as a tool – not as a substitute for their own thinking or for real friendships.

Hate and extremism

The internet is not always a friendly place. Trolls and haters launch attacks under the guise of anonymity and deliberately provoke people in comment columns. Online hate speech can spoil the fun of posting videos and photos online. Thinking carefully about what you post or share is the first step to a safe browsing experience.

Forming their own opinion is one of the developmental tasks of children and young people. During the orientation phase, they can be susceptible to simple answers and radical positions from extremists. Whether on social media, in forums, chats or in online games – children and young people can come across extreme opinions and conspiracy myths everywhere online. Make it clear to your child why they should not trust all content online. Show your child how they can check information and familiarize them with the various reporting points on the internet.

Gaming communication

Many gamers play games together, even if they are sitting in different places. When gaming, communication takes place via a headset or the chat function within a game. It is not always clear who is talking to you on the other end. Where possible, players should block contact attempts from strangers. Gamers sometimes use harsh language, known as trash talk. If insults and conflicts escalate, this can lead to hate among gamers. Stay in touch with your child about their favorite games and use technical youth media protection solutions.

You can learn even more about communication risks and how to deal with them in these posts:

AI & Me – When chatbots promise digital proximity

He should have blue eyes, a sporty figure and please be funny and sensitive. The perfect boyfriend can now be conjured up very quickly with just a few clicks, at least virtually. AI can be used to create platonic or romantic relationships with a seemingly flawless counterpart – so happiness in love without any heartache?

AI friendships – what is that?

Artificial intelligence (AI) is increasingly becoming part of our everyday lives. As a search engine, practical support for everyday problems – and now also as a substitute for personal relationships. When chatting via Snapchat and WhatsApp, there are AI contacts in the address book with whom you can chat just like with a human counterpart. Apps such as Anima, Replika or Romantic AI go one step further. Here, artificial friends can be generated according to your own wishes. Users can design their own appearance and character via a selection menu and put together friends or romantic partners as they wish. The chatbots can be contacted at any time and users can write, speak and sometimes even make video calls with them.

What fascinates children and young people about “digital friends”?

A friend exactly the way you want them – for many adults, but also for children and young people, this sounds tempting. And at first glance, a self-generated chatbot has many supposed advantages: it is quick and easy to create and immediately ‘ready for action’ – much faster than a friendship or romantic relationship can be established in real life. An AI bot is always available, always has time and a sympathetic ear, and seems to understand (almost) everything. Unlike real friends or partners, the bot never gets annoyed, bored, jealous or angry, but is exactly the counterpart we dream of.

Especially when children and young people are going through phases of loneliness, conflicts, problems or even psychological lows, chatbots may even be able to offer helpful answers and support – which is why some people are already working on using AI as an adjunct in psychotherapy. The visual design and verbal contact options can make the exchange feel very real and human and convey a good feeling similar to that of a friendship.

Why is caution advisable?

Nevertheless, it is not a good idea to replace interpersonal relationships with AI friendships and relationships in the future. After all, the seemingly perfect counterparts have limits – as well as pitfalls. For example, it initially seems tempting to have someone to listen to who never disagrees or expresses their own needs. However, the genuine, empathetic exchange and personality that characterize other people is quickly missing. There are also other critical points:

  • AI chatbots are language models and generate their answers to personal questions from data. This can sometimes include good tips – but it can also completely miss the point, be wrong, inappropriate or even harmful. Precisely because the bot cannot read ‘between the lines’ like a human and has often not been trained in an age-appropriate manner, it cannot always respond appropriately. Sometimes this makes the situation worse instead of better.
  • If you spend too much time dealing only with an AI chatbot, you can forget how to deal with real criticism and debate. Chatbots can be trained to always be polite and never disagree. Children and young people may thus feel affirmed at all times – even in negative thoughts – which can potentially deepen their loneliness rather than ease it.
  • Particular caution is required in the case of romantic relationships via AI bots: young children and adolescents in particular can be confronted with sexual and pornographic content that is not age-appropriate or even be asked to send their own nude photos, which can be disturbing.
  • And finally, the providers behind chatbots always have a financial interest, of course. They want to collect data and earn money from the apparent relationship.

What should parents pay attention to?

Talk to your child about these problematic topics and consider together which situations and topics an AI bot might be a good contact for – but also where there is a limit beyond which the ‘relationship’ becomes problematic. Make it clear to your child that security and data protection are important and that they have a right to them. Together, make sure that you do not disclose any personal data. Help your child to discover, build and maintain real friendships. Encourage your child to contact you at any time if they have any uncertainties or problems.

Media usage contract

Who is allowed to do what with media and for how long? These issues come up in every family sooner or later and not infrequently cause stress and arguments. Rules on media use can help create a structure and avoid conflicts. These can be discussed by parents and children together and recorded in a contract. The online tool for a media usage contract presented here — an offer by klicksafe and the Internet-ABC — is suitable for this purpose.

In a nutshell:

  • Free online tool, accessible via: www.mediennutzungsvertrag.de
  • Contract can be customized and personalized
  • Selection from many rule proposals
  • Own rules can be integrated
  • Creative backgrounds
  • Print directly

How does the creation of the contract work?

The tool guides you step by step to the finished paper. You can choose from two age groups (6-12 years and 12+) and select a title design, a mascot and a background. All the rules you select are automatically inserted like building blocks, so it’s easy to keep track of them. Each module can be edited individually, and of course you can also insert your own rules. At the end, you can save the document and complete it at another time. You can also create multiple contracts for different children.

Tips and backgrounds

Use the building blocks as a prompt to start a conversation in your family about media use. You may not have thought of some of the suggested rules; others will already be self-evident. Set priorities, because the tool offers a great many ideas, and not all of them have to be implemented. There are several types of rules:

  • General rules (such as dealing with conflicts, questionable content, handling of devices)
  • Time regulation (determination of time quotas)
  • Cell phone (such as how to deal with apps and data, mobile-free places, dealing with costs)
  • Internet (such as security settings, use of websites)
  • Television (such as age-appropriate offerings, watching together)
  • Games (such as shared games, fairness)

A contract is nothing more than a set of written-down rules that have been agreed upon. The advantage is that you can always look at it and be reminded of them.

A special feature is that rules can also be set for parents. For example, parents can commit to not using their cell phone at dinner, or to using adult media content only when children are not present. Rules are easier for children to understand if everyone has to follow them and you set a good example.

Being a role model right from the start – how babies and toddlers learn to use media

You are reading a story to your child and suddenly the phone beeps to announce a new WhatsApp message. What do you do? Do you automatically reach for your smartphone or do you read the message later when the child is asleep?

Such situations probably occur in every family. And when the little son then reaches for the smartphone, he is told: “That’s not for you yet!”

Learning through observation

Be aware that parents and also other adults have an important role model function for children. Children experience how you, as their closest caregivers, deal with digital media and orient themselves to this. This is how children learn how the world works and how to behave in certain situations. Your behavior therefore has a major impact on how your child uses media themselves. By actively setting an example of what good media use can look like, you help your child learn to use smartphones and the like independently, sensibly and consciously.

Especially for younger children, parents are number one. It is particularly important for babies and toddlers to sense your attention through direct eye contact and to establish a good bond. That is hardly possible when dad is constantly looking at his smartphone – and children notice this even at a young age. The older children get, the more they emulate you. As toddlers, they reach for daddy’s smartphone or speak into a building block of a similar shape. They realize early on how important this device is for adults or older siblings.

Be a good role model

Create a good basis for a reflective approach to media right from the start. Keep the smartphone on silent in your pocket when you play with your child, so that they don’t get the impression that the phone is always more important – otherwise your child may later behave the same way. If you do take it out, explain to your child why.

There are certainly moments when the smartphone is needed to take a nice photo of your offspring. Capture beautiful moments with the camera! But think about how often that really has to happen. After all, your child would rather look you in the eye than at the smartphone constantly held in front of your face.

Spending time together with media is also part of family life. Introduce your child to media slowly and choose age-appropriate content. However, such media experiences should always alternate with media-free times.

In all of this, be aware of your role as a role model!

The constant argument about media time: At what point is it too much?

“Now put the phone down!” Many families hear sentences like this almost every day. Parents worry that their child is spending too much time on their smartphone, tablet or console – and wonder whether school, health or family time together is suffering as a result. But when does media use really become a problem?

Digital media are part of everyday life

Whether writing messages, watching videos, doing homework or simply relaxing – digital media are an integral part of our everyday lives. Children and young people also grow up with them as a matter of course. Media offer many opportunities: they promote creativity, connect across distances, help with learning or provide entertainment. They are a source of information, a leisure activity, a social meeting place and often also part of schoolwork.

Usage times have changed considerably in recent years: children and young people spend significantly more time in front of screens than they used to. In this, they orient themselves strongly towards adults: if the smartphone is always at hand or the TV is on in the background, children automatically adopt these habits. They learn not only which media are important, but also how to use them.

When does media use become a problem?

Media are fascinating. They are colorful, exciting, offer variety – and there is always something new to discover. It can therefore be particularly difficult for children and young people to take breaks or stop at the right time.

Phases of intensive media use are completely normal – especially during puberty.

It is important to consider the child’s perspective: media use can be an expression of creativity, belonging or security – and often fulfills a real need. Media can help to reduce stress or maintain contact with peers. It becomes difficult when other areas of life are permanently neglected: Friends, school, hobbies, exercise or sleep.

Warning signs of problematic use

Pay attention to the following signals:

  • Your child is often irritable or restless after use
  • Withdraws or has concentration problems
  • Lack of sleep or mood swings occur
  • Media take center stage – other interests are permanently pushed into the background

In rarer cases, there may be a loss of control or significant neglect of other areas of life.

If such excessive behavior occurs over a longer period of time, talk to your child about possible changes – and get support if necessary.

It is important not to condemn or prohibit across the board, but to understand and support.

How much media time is okay?

There is no one-size-fits-all answer to the question of the perfect duration of use. Rather, age, stage of development, content and type of use as well as the daily routine and other activities are decisive. A learning video for school is to be assessed differently than endless scrolling through social media or constantly checking the news. Various guides provide recommendations on media time, but these are only rough guidelines. Instead of rigid time limits, what counts is a balanced daily routine in which children have a say. This builds confidence, promotes media literacy and helps them to better recognize their own needs and limits.

Tips for parents

Talk about the effect: Ask your child how they feel after using media. Was it relaxing, exhausting, frustrating or inspiring? This teaches them to recognize their own limits.

Establish rules together: Involve your child in the decision. This will make them feel taken seriously.

Discuss content and goals: What does your child use the tablet or cell phone for? Is it for homework, playing, relaxing or keeping in touch with friends?

Use contracts or agreements: A media usage agreement can help to set out clear rules and expectations in writing.

Be a role model: Reflect on your own use of media. Children learn through observation.

Encourage offline time: Create spaces without a screen – for exercise, playing, talking or cooking together.

Use technical aids: Time limits and age restrictions can be set for many devices, especially for younger children.


The Fediverse: The better alternative to Meta, TikTok & Co.

Facebook, Instagram, TikTok, YouTube: The best-known social networks mostly belong to a few large corporations, often US companies such as Meta, Google or the Chinese company ByteDance. There, algorithms, advertising and data collection determine what we see. But there are alternatives: the Fediverse, an association of decentralized networks that are usually more privacy-friendly, ad-free and sometimes even non-profit.

What is the Fediverse?

The term “Fediverse” is an artificial word made up of “federated” (networked) and “universe”. It refers to a network of independent platforms that are nevertheless connected to each other. This is made possible by a common technical standard called ActivityPub.

The Fediverse works a bit like e-mail: There are many providers, but you can still communicate with each other – without any central control. This ensures greater diversity, data protection and digital self-determination. For example: If you have an account with Mastodon (Twitter alternative), you can also interact with users on Pixelfed (Instagram alternative) or PeerTube (YouTube alternative).

The most important platforms at a glance

Mastodon – like Twitter, but independent

Mastodon is the best-known platform in the Fediverse. It is reminiscent of X (formerly Twitter): You write short posts, follow others and comment. Unlike X, however, there is no central company here, but many individual servers, operated by associations, initiatives or private individuals.
There is no advertising, no algorithms and data protection is paramount. It is a little strange to use at first because content is not automatically suggested. You build up your own network step by step.

Pixelfed – like Instagram, but without Meta

Pixelfed looks like Instagram – only without advertising, tracking or a corporation in the background. Posting photos, sharing stories, liking posts: you can do all that here too. Many people use Pixelfed to showcase their travels, creative projects or everyday experiences – privacy-friendly and without the pressure to collect likes.

PeerTube – like YouTube, but collaborative

PeerTube is a decentralized video platform. Users upload their videos to various servers, for example from educational institutions, media projects or activists. Instead of chasing clicks and ads, the focus here is on the content – without any tracking.

Funkwhale – Share music in the community

Funkwhale is a platform for sharing and streaming music. It is primarily aimed at independent artists, small labels or people who want to distribute podcasts or their own music fairly and collectively. Here, too, there is no advertising and no tracking of users.

Bluesky – exciting, but not (yet) part of the Fediverse

Bluesky was originally co-developed by Twitter and also aims to be a decentralized network. However, it is based on its own technical system (AT protocol) and is not directly connected to Mastodon or Pixelfed. Nevertheless, it is considered an exciting alternative – especially for former X users.

What you should consider

The Fediverse shows that there is another way: without personalized advertising, pressure to collect likes, or the constant fear of missing out on something. Instead, it offers more self-determination, diversity and data protection.

However, the platforms are often smaller. Some functions such as automatic recommendations or a large reach are missing, and many friends are not yet active there. This makes it not (yet) very attractive for many children and young people. However, it is worth a look, especially for parents, teachers or people interested in media. The Fediverse promotes digital maturity – and a more conscious, social interaction online.

If you want to get a taste of it, you can take a look at joinmastodon.org, pixelfed.org or joinpeertube.org, for example. Instances such as mastodon.social, chaos.social or pixelfed.de offer a quick introduction and help with orientation.

Meta AI – The AI assistant in WhatsApp, Instagram and Facebook

Meta AI is a new digital assistant from Meta, the company behind Facebook, Instagram and WhatsApp. Without a separate app, young people can now access artificial intelligence directly via chat or search – for learning, chatting or collecting ideas. The question for parents is: how does it work – and is it safe?

In brief

  • AI directly in popular apps (WhatsApp, Instagram, Facebook)
  • Minimum age according to GTC: 13 years (without age check)
  • Problematic: data protection, emotional attachment, misinformation
  • Free of charge, but Meta uses data for training, personalization and advertising

How does Meta AI work?

Meta AI has been officially available in Germany since March 2025. A blue circle with a sparkling pattern signals access to AI-supported chats – directly in WhatsApp, Instagram, Facebook or Messenger. A separate app is not required. The text-based assistant responds to questions, gives tips and suggests content. Meta AI is also integrated into the search bar on Instagram and Facebook. This can lead to users interacting with the AI unintentionally.

  • WhatsApp and Messenger: With “@MetaAI”, the AI can be activated in individual or group chats. It helps with planning, research, and writing or improving texts.
  • Instagram and Facebook: Questions can be asked directly via the search bar or as a direct message. The AI then suggests posts, hashtags or content – based on existing data and probabilities.
  • Restricted in Europe: Functions such as image generation or creative design tools are deactivated in the EU – due to the stricter requirements of the European AI Act.
  • No EU training data according to Meta: Meta assures that no private chats or personal data of minors from Europe will be used for AI training – provided that they correctly state their age and are recognized as minors.

What fascinates children and young people about it?

Many young people experience Meta AI as practical support in everyday life. The AI is available exactly where they are anyway – on WhatsApp or Instagram. It provides quick answers to school questions, helps with translations, writes texts or makes suggestions for posts and content. The AI is friendly, approachable and helpful, almost like a conversation partner.

Especially in comparison with more complex AI offerings such as ChatGPT or Perplexity AI – which cite sources or require more user knowledge – Meta AI feels much more low-threshold and familiar because it appears directly in the everyday apps young people already use. Many young people also try out Meta AI because they talk about it with their friends or share content.

What can be problematic?

  • Misinformation: Meta AI often sounds convincing, but can give factually incorrect or one-sided answers. The tool is based on training data that contains biases or does not cover certain topics at all. In such cases, the AI may simply invent (“hallucinate”) content.
  • Data protection: Meta uses publicly available content from Facebook and Instagram as well as user behavior to improve AI – also in Europe.
  • Privacy: According to Meta, private chats and data of minors should not be used for AI training. It remains unclear what other data is actually used.
  • Opt-out required: If you do not want your own public posts to be used for AI training, you must actively object – via a web form on Facebook or Instagram. Important: The opt-out only applies to future content. Data that has already been used cannot be deleted retrospectively.
  • Emotional closeness: The AI imitates human conversations. Some young people might mistake it for a real friend.
  • Problematic content: Despite protective filters, Meta AI may address sensitive or problematic content, for example on sexuality, eating behavior or psychological problems.
  • Cultural differences: Meta AI was predominantly trained with English-language content. Some answers do not fit well with the German or European context.
  • Data protection in the EU: Data protection experts accuse Meta of circumventing European laws. Complaints to supervisory authorities are under review.
  • Personalized advertising: According to Meta, interactions with AI can also be used to personalize advertising.

What does the provider think?

Meta emphasizes that no private messages or data from minors in Europe are used for training. According to Meta, public content is accessed within the framework of applicable data protection laws. In the EU, users must actively object if they do not want their content to be used for AI training. Data protection advocates, however, criticize the fact that there is no active consent requirement.

What should parents pay attention to?

Parents can best protect and empower their children if they stay in conversation and reflect together on how AI works.

  • Start a conversation: Ask your child openly whether they use Meta AI, what they do with it and what they like about it. Show interest without being controlling.

  • Address age limits: The minimum age for using Meta AI is 13 years, or even 16 depending on the app, but there is no automatic age check. Talk to your child about these age limits too.

  • Strengthen critical thinking: Encourage your child to question statements made by AI. Review content together and talk about uncertainties, mistakes and the right way to use it, i.e. also about do’s and don’ts when dealing with AI.
  • Check privacy settings together: Go through the settings with your child. Also think about the objection to AI training.
  • Deactivate or hide Meta AI: In some apps, the Meta AI icon can be muted or hidden. This can prevent unintentional or accidental use.
  • Emphasize human contact: Remind your child that AI does not replace real relationships. Feelings, worries or important topics belong in real conversations with family, friends, confidants or professional counselors.