CHATGPT
MINIMUM AGE:
Under 13: Not allowed to use ChatGPT
13–17: Allowed with parent/guardian permission
18+: Can use it independently

WHAT IS IT?
ChatGPT is an artificial intelligence (AI) chatbot developed by OpenAI that can understand and generate human-like text. It is designed to answer questions, explain information, help with writing, learning, and problem-solving, and have conversations with users. People use ChatGPT for a wide range of tasks, such as researching topics, getting help with schoolwork, generating ideas, or learning new skills. It works by analysing large amounts of text data to predict and produce helpful responses, but it does not think or feel like a human and may sometimes make mistakes, so information should be checked with reliable sources.

WHAT WE SAY: Treat with caution.

🎓 How People Use It
ChatGPT can be used on a website or mobile app and is popular with students, professionals, and everyday users. It can help explain difficult topics, assist with brainstorming ideas, summarise information, or support learning and productivity tasks.

📢 Age Restrictions
The minimum age to use ChatGPT is 13 years old. Users under 18 should have permission from a parent or guardian before using the platform.

🔐 Privacy & Safety
ChatGPT is designed to answer questions and provide information, but users should avoid sharing personal information, such as their full name, address, school, or passwords, when using AI tools.

⚠️ Accuracy & Critical Thinking
Although ChatGPT can be helpful, it does not always provide perfect or fully accurate information. It is important to encourage children to double-check facts using reliable sources and to use AI as a learning tool rather than relying on it completely.

🤝 Parasocial Relationships With AI
One emerging concern with AI chatbots is the potential for parasocial relationships: one-sided emotional connections that a person forms with a digital character, influencer, or AI system.
Because AI tools like chatbots can respond in a friendly, conversational way, some users, especially younger ones, may begin to feel as though the AI is a real friend or someone who understands them personally.

⚠️ Why This Can Be a Risk
While AI can be helpful for learning or asking questions, it is important to remember that AI is not a real person and does not have feelings, opinions, or personal experiences. Over-reliance on AI for emotional support or companionship could reduce opportunities for healthy social interaction with real friends, family, or trusted adults.

👨👩👧 Guidance for Parents
Parents can help by encouraging children to view AI as a tool for learning and creativity rather than a replacement for human relationships. Talking openly about how AI works, setting healthy boundaries for screen time, and encouraging offline activities and social connections can help ensure children use AI in a balanced and responsible way.
SAFETY TIPS:
Explore the Internet Matters Parental Controls Guide for ChatGPT.

📲 Learn How AI Tools Work
Take time to explore how AI chatbots like ChatGPT work. Understanding what the tool can and cannot do will help you guide your child in using it safely and responsibly.

💬 Talk About What AI Is (and Isn’t)
Explain that AI can sound conversational but is not a real person and does not have feelings or personal experiences. Helping children understand this can reduce the risk of forming unhealthy emotional attachments.

🔐 Avoid Sharing Personal Information
Remind children not to share personal details such as their full name, address, school, passwords, or other private information when using AI tools or other online platforms.

🧠 Encourage Critical Thinking
Teach children that AI responses may not always be accurate or complete. Encourage them to double-check important information using trusted sources.

⏱️ Promote Balanced Screen Time
Encourage children to use AI tools in moderation and to balance screen time with offline activities such as hobbies, sports, and spending time with family and friends.

👨👩👧 Keep Communication Open
Talk regularly with your child about how they are using AI tools and what they are asking. This builds trust and makes it easier for them to ask questions or share concerns.

👀 Supervise Younger Users
If younger children are using AI tools, consider supervising their use or keeping devices in shared family spaces to help monitor activity naturally.

⚠️ Watch for Over-Reliance
Encourage children to use AI as a learning aid rather than relying on it to complete homework or make decisions for them. Building independent thinking and problem-solving skills remains important.

FURTHER SUPPORT:
For parents/carers
If you're worried about your child or need support, call the NSPCC helpline on 0808 800 5000.
If you see suspected child sexual abuse content online, report it to the police or the Internet Watch Foundation (IWF), which helps remove illegal images from the internet. If you’re worried about online sexual exploitation or abuse, you can make a report to the Child Exploitation and Online Protection Command (CEOP), a UK law enforcement team dedicated to protecting children and young people online.

For children & young people
If they're worried or want to talk, encourage them to contact Childline online or call 0800 1111.

Check out our Factsheets for further information and useful online safety tips.