With apps like ChatGPT becoming a key part of the way that we use the internet, it makes sense that young people are also finding uses for this technology. If your child has access to the internet, it might be time to chat about AI.
Why speak to your children about AI?
Children have a much closer relationship with technology than adults did when they were growing up, and as such they’re learning to use it at a much younger age. Young people are learning to code at an age when many adults didn’t even have access to the internet; their school lives and their personal lives have always been online.
It makes sense to speak to children about technology before they're given a smartphone, and it also makes sense to speak to your children about AI before they find their way to it themselves. Children are exposed to a wide range of applications, tools and programs, and they're fast learners: they start testing the limits of this technology very quickly. Without parental guidance, young people are at risk of harming themselves or others.
1. AI is really good at telling you what you want to hear
One of the reasons applications like ChatGPT have become so popular is that they can produce well-written text on demand. As such, many children and teenagers have started using ChatGPT to write their assignments. However, whilst ChatGPT is excellent at producing something that looks and sounds correct, it is far less reliable at producing something that is factually correct.
When it comes to references, sources, maths and other facts, you're better off using your own brain (or a calculator) than relying on AI. Applications like ChatGPT will happily invent articles, studies and documents to support their arguments, and present fiction as fact. Parents should encourage children to think critically about any material they read on the internet, particularly content created by AI.
2. Everyone is using AI
Another reason that AI shouldn't finish your homework for you is that AI could be finishing everyone's homework for them. One of the easiest ways to identify an AI-produced assignment is when multiple students hand a teacher identical pieces of work.
Platforms like Turnitin scan documents for plagiarism and AI-written content using their own AI technology. Although the software is not always accurate, it does mean that if children move on to further education, an AI-written assignment is likely to be caught. It's important for children to understand that this can adversely affect their education and qualifications.
3. Chats with AI aren’t private
Although a conversation with ChatGPT might feel like talking to a real person, you're actually interacting with a model trained on a huge amount of data, and you're also helping that model to learn. Information that you give to ChatGPT isn't confidential: your messages can be stored and used to improve the answers it gives the next person who asks a question.
Parents should encourage children to speak to trusted adults when they need to confide in someone, not an AI chatbot. This isn't just about protecting their personal information and data; it's also about ensuring that children who need help get support from a real adult who can keep them safe.
What can we use AI for?
Just because AI has limitations doesn't mean we should stop children from accessing it altogether. Apps like ChatGPT can help explain complicated subjects to children, create schedules and generate ideas. Like other apps on the internet, ChatGPT can be a great tool when used responsibly.