What is Character AI? What parents need to know

A hand holds a smartphone that reads character.ai in front of a computer screen.

Character AI is a chatbot service that lets users have customised conversations. With millions of people using the platform, there are concerns about its safety for children.

Learn what you can do to keep your teen safe.

What is Character AI?

Character AI is a chatbot service that can generate human-like text responses based on a user’s customisation.

Launched in 2022, users can create characters with customisable personalities and responses. They can publish these characters to the community for others to chat with or can chat with the character themselves.

The service uses artificial intelligence to create believable responses. It’s popular among children and young people due to the ability to customise characters. They can base them on existing people or characters in popular culture or create something new.

Minimum age requirements

According to Character AI’s Terms of Service, users must be 13 or older to use the service. EU citizens or residents must be 16.

If an underage child tries to register, they get a notification that there is a problem with their sign-up. They are then automatically returned to the sign-in page and cannot try again. However, the service has no age verification, so children can get around this restriction by lying about their age.

Note that the app stores rate the app higher: Google Play lists the Character AI app as requiring ‘Parental Guidance’, while Apple’s App Store rates it 17+.

How it works

Once registered, users can browse a range of chatbots for different purposes. This includes practising a language, brainstorming new ideas and playing games. The chatbots featured on the Discover page are relatively neutral when you first join. However, there is the option to search for specific words, and these searches do not seem to be filtered in any way.

Users can also create a character or voice.

Create a character

Users can customise their character’s profile picture, name, tagline and description, as well as their greeting and voice. They can choose to keep the character private or share it publicly.

You can then ‘define’ the character. This is where you explain how you want them to talk or act, giving the chatbot character a ‘personality’.

You can also customise characters so that they respond in specific ways. So, if you say that the character is depressed, its responses will reflect that. Equally, if you say it is upbeat, it will reflect that.

The conversations are realistic, so it’s easy to feel immersed in the character’s worldview.

Creating a voice

To create a voice, users must upload a clear 10-15 second audio clip of the voice with no background noise. You can use shorter clips, but the voice might seem more robotic this way. To finish uploading, you must give the voice a name and tagline.

If you assign a character the voice you’ve uploaded, you can play the chatbot responses in that voice. It does a fairly accurate job of mimicking the uploaded voice.

Is Character AI safe?

Controversies

In October 2024, a teen took their life after interacting with a character they created. The conversations preceding their death related to suicide. In one example, the chatbot asked the teen if they had planned how they would end their life.

In the same month, an investigation found chatbots which mimicked Brianna Ghey and Molly Russell on the Character AI platform. Once alerted to them, the platform deleted the chatbots. The incident brought into question the process of creating character chatbots, especially when they mirror real people.

Prior to these incidents, others reported poor advice and biases against race and gender.

However, other users have credited the Character AI app with supporting their mental health.

Safety considerations

As with any AI tool, Character AI learns from the people who interact with it. Unfortunately, this can lead to inappropriate and harmful content.

The app’s Terms of Service and Community Guidelines are similar to others and warn against such content. Additionally, users can report chatbots or characters for a range of reasons. The Community Guidelines encourage users to share specific examples of rule violations where possible.

Screenshot of reporting options on Character AI

While the app has its own moderation team, it cannot moderate direct messages between community members. This is because any such messaging takes place off-platform, such as on Discord or Reddit. However, the team encourages users to report any rule violations they see in the app.

Safety for under-18s

Character AI announced a series of safety features for under-18s. They include:

  • changes to the models that minors use, including reducing the chances of coming across inappropriate content;
  • greater response and intervention when users violate the Terms of Service or Community Guidelines;
  • an added disclaimer to remind users that the chatbots are not real people;
  • notifications after an hour-long session on the platform.

Additionally, Character AI has removed characters flagged as violating rules. Users can no longer access chat histories involving these characters.

Character AI says that it takes the safety of its users seriously and is working towards further safety features for all users.

5 tips to keep teens safe on Character AI

Regularly check in on their use of the app

Which characters are they talking to and how? Do they have any potentially harmful personalities? These check-ins are a part of regular online safety and help children develop their understanding of harmful content.

Together, review the privacy and safety features

Go through the app together so you both know where and how to report harmful content. Also, review the Terms of Service and Community Guidelines to make sure your teen knows what they should report.

Consider their maturity as well as age

Not every teen has the critical thinking and media literacy skills to use Character AI. So, consider whether your child has the skills to manage potential harm before allowing access.

Use parental controls in the app store, on devices and on your network to restrict access to Character AI if you don’t want them to access it. However, remember to also talk about why you’re doing this.

Discuss the appropriate ways to use AI tools

To protect their privacy, they should not upload their own voice or images. Equally, they should not upload the voices or images of friends, family members or other real people. An exception for voices might be if they enjoy voice acting and want to upload a made-up voice.

Set limits around usage

This could include setting time limits, defining where they can use the app and agreeing on how they can use the app. For example, you might limit them to an hour in a common area at home. You might also let them use it for learning or practising skills but don’t want them having casual conversations with chatbots.
