Toys and objects connected to conversational assistants threaten the mental health of children and adolescents
Press release from the French National Academy of Medicine
November 28, 2025
Many toys and objects connected to conversational agents based on spoken or written language (chatbots) are available on the market. Generative artificial intelligence (AI) can construct sentences and dialogues as well as humans do. Early use by children and adolescents, who are naturally inclined to imagine and build relationships with others, leads them to attribute human qualities to these objects.
The link that develops between a child or teenager and a toy or object connected to a conversational assistant – via a tablet or computer – can lead to excessive trust, sometimes greater than that placed in parental advice. This trust is reinforced by a sense of familiarity and a shared language, nurtured by an environment perceived as benevolent, as well as by the illusion of a reciprocal relationship, amplified by algorithms.
Beyond excessive trust, there is a risk that children or adolescents develop dependence, or misuse these devices in problematic ways, exacerbated by several factors:
– excessive personalization, likely to foster a feeling of friendship;
– the confidence with which AI provides answers without apparent effort, which can impair children's and adolescents' judgement, fueling a denial of uncertainty and of unrecognized errors (1-3);
– the transmission, by the operators of this software, of unverified information that escapes parental vigilance.
Given the overall danger, and its consequences, that these factors represent for both children and adolescents, the French National Academy of Medicine:
Warns about:
– the high risk of cognitive and emotional dependence of children and adolescents on the power of algorithms;
– the risk that the rules set by these devices substitute for educational and family reference points;
– the risk of using chatbots as a privileged confidant, which can lead to the reinforcement of anxious or morbid thoughts, or risky behaviors (restrictive diets, self-harming acts, or even suicidal behaviour).
Warns, more particularly:
Parents who may be inclined to offer toys and connected objects equipped with conversational assistants: under the regulation recently adopted by the European Union (4), these objects are recognized as high-risk systems able to interact directly with children's cognitive abilities, and therefore require increased supervision.
Recommendations:
For children:
– No use of digital devices before the age of 3;
– A ban on connected speakers and toys before the age of 6;
– Beyond the age of 6, systematic parental supervision.
For teenagers:
– No access to conversational robots/avatars* before the age of 12, and thereafter only under strict parental control;
– A ban on access to digital companions** before the age of 18, given an increased risk of psychological fragility, including suicidal tendencies, since their use requires discernment and critical-thinking skills. The power of these devices is such that their use in the treatment of psychiatric illnesses is being considered (5).
For the relevant health authorities:
– Access to chatbots and digital companions should be subject to regulatory measures regarding their use by children and adolescents, as recently recommended by WHO (6).
References
1. Frances A., OpenAI Finally Admits ChatGPT Causes Psychiatric Harm. Psychiatric Times. 26 August 2025. (https://www.psychiatrictimes.com/view/openai-finally-admits-chatgpt-causes-psychiatric-harm, accessed 20 September 2025).
2. Tisseron S., L’Emprise insidieuse des machines parlantes, plus jamais seul (The insidious influence of talking machines: never alone again), Les Liens qui libèrent. (2020).
3. Sycophancy in GPT-4o: what happened and what we’re doing about it. OpenAI. 29 April 2025. (https://openai.com/index/sycophancy-in-gpt-4o; accessed on 11 August 2025).
4. Regulation (EU) 2024/1689, Articles 5(1)(b) and 6(2), Annex III § 3 – Official Journal L 2024/1689 of 12 July 2024.
5. Cohen D., Chetouani M., Anzalone S. Jeux sérieux et robotiques d’accompagnement dans les troubles neurodéveloppementaux: sont-ils déjà des outils thérapeutiques ? (Serious games and assistive robotics in neurodevelopmental disorders: are they already therapeutic tools?) Bull Acad Natl Med, 2025 (in press).
6. WHO Regional Office for Europe, Addressing the digital determinants of youth mental health and well-being: policy brief, Copenhagen, 2025, p. 9.
Glossary
* Conversational robots are software programs that provide information and occasional, one-off help without establishing an emotional connection (“What is the weather like today?”).
** Digital companions simulate an emotional relationship by actively engaging the user and remembering their preferences.
Bull Acad Natl Med 2025;209:pp-pp. [Online] Available at: URL
