Beware of AI's adverse effects on students, lawmakers warn
Over-reliance on artificial intelligence among students should be monitored closely, as it may harm the healthy development of their thinking and social abilities, many national lawmakers have said.
Many students use AI to ghostwrite homework or trust it blindly, and some have even become addicted to AI chat, a phenomenon that should concern parents, schools and the government, they said, calling for stronger efforts to ensure the correct, effective and safe use of the increasingly dominant technology.
"Currently, the abuse of AI among college students is becoming increasingly common," said Gou Xinglong, vice-president of Sichuan University of Arts and Science, and a deputy to the 14th National People's Congress, adding that students tend to use AI to complete academic assignments, including complicated and practical ones such as writing essays, experimental reports, survey reports, programming, designing, and making videos.
"Some students rely on AI to find answers immediately whenever problems arise. It appears that you have acquired a lot of information, but it lacks in-depth understanding and independent construction of knowledge. Students do not truly go through the process of thinking, exploration, and trial and error. Without AI assistance, they struggle to solve problems independently," he said.
Gou said this trend will weaken students' capacity for independent analysis and problem-solving, critical thinking, and creativity, "making it difficult for them to adapt to real-world work after graduation", adding that it also breeds academic misconduct and further undermines social fairness and justice.
Given that the authenticity, accuracy, reliability and completeness of AI-generated information remain to be verified, and that it may even contain factual errors and misleading values, Gou suggested treating AI as a "thinking scaffold".
"When faced with problems, form your framework independently first, then use AI for verification or expansion; write a first draft before using AI to polish your assignments. Never copy and paste directly to avoid developing intellectual inertia."
The idea was echoed by Wang Xiaomei, an NPC deputy and vice-president of Qingshen Middle School in Meishan, Sichuan province. Her school has more than 2,000 students aged 12 to 18.
"It's not uncommon to see children handing over a composition far beyond their level. Since it's very good and without personal perspective, teachers can quickly know it was written by AI," she said.
Though the school does not ban the use of AI, it discourages using AI to do homework, especially for younger students.
"It takes time to remember pinyin and recite the nine-nine multiplication table. Such efforts cannot be saved because basic knowledge lays the academic foundation," she said. "After all, they cannot count on AI at high school and college entrance examinations."
Wang added that reliance on AI stems from laziness, and that both teachers and parents should tell children that AI is only a tool that provides assistance.
"Parents need to limit the time children spend using AI. And the nation should have a complete system to govern the application of this technology in schools. Limits should also be imposed on the content of AI services; their application should not be expanded without restriction," she said.
Gou said abuse of AI can also result in social alienation, referring to examples of young people seeking emotional support from AI, or even developing romance with AI "boyfriends" or "girlfriends".
"The 'bottomless pandering' nature of AI can easily foster virtual emotional dependence, tempting students to replace real-life social interactions with virtual ones," he said.
"This not only leads to declining communication skills but also makes them apply the 'algorithmic logic of efficiency' to interpersonal relationships. They pursue instant feedback and cannot accept real frictions and deep emotions, thus intensifying loneliness and social anxiety," Guo added.
He suggested setting daily time limits to prevent digital addiction, and forcing oneself to engage in pure paper-based thinking to avoid psychological dependence.
He also suggested sparing time for offline activities, practicing empathy and expression through real interaction, and maintaining healthy social skills.
Commenting on the case of a 14-year-old boy in Florida who became obsessed with his AI girlfriend, was isolated from the outside world and was ultimately induced to end his life, Zhu Shan, another NPC deputy and director of Guizhou Gui Da Law Firm, said that inviolable technical red lines must be set to prevent AI from engaging in unethical conduct.
"No matter how much investment has been put to make AI resemble humans, it is not human, and does not have human emotions. In algorithm design, the principle of human priority should be written into the algorithmic program as a fundamental rule, and as a bottom line, it should by no means harm humans either physically or psychologically," he said.
In addition, forward-looking research and assessment of legal rules should keep pace with technological innovation to define the ethical boundaries between humans and AI, he added.
chenmeiling@chinadaily.com.cn