Will AI stifle our creativity?
The widespread adoption of artificial intelligence (AI) has made access to information remarkably easy. Behind this convenience, however, lies a hidden concern: it could make people "lazy." Recently, Stanford University hosted the world's first AI-led academic conference, aiming to test AI's ability to independently generate scientific insights. The event brought these concerns to the forefront: AI's potential for innovation in scientific research is promising, but it also carries the risk of accelerating the "degeneration" of human thinking and creativity.
"This concern is not unfounded," said Li Bohan, associate professor at the School of Artificial Intelligence, Nanjing University of Aeronautics and Astronautics. A study from MIT found that people who rely on AI over long periods show weaker neural connectivity in the brain and a sharp decline in critical thinking, as if shrouded in an "AI brain fog."
There are plenty of similar examples in everyday life: people copy AI's answers verbatim when writing papers, or base decisions entirely on AI's advice. Over time, many feel their brains have become mere ornaments, their thinking fully "outsourced" to AI.
"Fortunately, today's AI is not omnipotent. Its core capability is to decompose and recombine existing data and knowledge, at best pushing things forward within established frameworks. It will not, like a human, suddenly produce an 'illogical' flash of inspiration and achieve a leap in thinking," said Li Bohan. He added that the world-changing breakthroughs in the history of science were precisely those born of such leaps. For an AI that relies entirely on data, this remains a formidable hurdle.
In other words, AI is at best a helper for researchers; it cannot serve as an outsourced substitute for thinking.
Wang Peng, associate researcher and big data planner at the Beijing Academy of Social Sciences, believes that the question of AI entering the scientific research field has never been "whether to use it," but rather "how to use it." In routine scientific research tasks such as deciphering protein folding and finding new materials, AI has already demonstrated its ability to accelerate research. "What we really need to be wary of is the problem of ordinary people's over-reliance on AI."
Imagine if AI could do everything for us—writing, drawing, composing music—wouldn't we become too lazy to do it ourselves and quietly give up the joy of creation? If students rely entirely on AI for their homework, wouldn't their ability to think independently deteriorate over time?
"Of course, it would be too pessimistic to regard AI as a 'killer' of creativity based on this alone," Wang Peng said. He pointed out that the advent of photography did not end the art of painting, but rather gave rise to Impressionism and modern art; the widespread use of calculators did not lead to the decline of mathematics, but rather promoted deeper mathematical research. AI is similar; it can change the way creation is done, but it will not take away the core of creation.
So, how exactly should we get along with AI?
Experts suggest building new cognitive habits: when facing a complex problem, first form an initial idea of your own, then use AI to expand and deepen it; when AI gives you an answer, don't use it as-is, but ask "why" and "where are the loopholes," as if debating with another person; and occasionally "detox" by putting down your phone and AI and giving yourself time to think independently.
"The right way to use AI is to treat its answers as a 'first draft,' not a 'final draft,'" Li Bohan believes. In this transformation of human-machine co-creation, the key lies in how humans choose to act. If we keep hold of the reins, AI becomes our "wings"; if we abandon independent thinking, it becomes a "cage" for thought.
