What happens when students believe they are “bad” at learning?

QS Midweek Brief - December 17, 2025. How can we overcome past learning traumas to achieve future academic success?

Welcome to the final edition of the QS Midweek Brief for 2025. Thank you so much for joining us throughout the year as we unpacked more about the global higher ed sector.

In the final newsletter of the year, we shift our focus to what the sector is all about: students. Specifically, we look at what happens when students consider themselves bad at learning and how they can overcome it. We also look at the rise of AI-enhanced applications to universities.

On behalf of the team, I wish you a very happy and pleasant end-of-year period, and we look forward to seeing you in the new year.

Stay insightful,
Anton John Crace
Editor in Chief, QS Insights Magazine
QS Quacquarelli Symonds


Counting everybody in

By Claudia Civinini

The book The Celestine Prophecy is more readily associated with spiritual retreats and yoga mats than calculus.

But for Dr Geillan Aly, it was the gateway to a lifetime of mathematics.

Hungry for meaning and fresh out of university in the late 1990s and early 2000s, she stumbled upon those “books that are not part of the syllabus”, as she describes them, and read New Age philosophy for a while.

But seeking to investigate how much truth there was behind the spiritual notion of energy, she moved on to other books – into science, rather than more New Age.

Re-enrolling at university in her late 20s to learn physics and maths was the natural progression. However, she had to choose one as she couldn’t afford to study both.

“I thought: math is universal. That brought me back to thinking about high school, my teacher, and to realising that I had been using math to solve problems in my daily life and my jobs,” she recalls.

“And math is probably the closest we can get to understanding our universe. That brings us closer to God, our purpose, and everything else.”

Not a maths person

Dr Aly was born in Giza, Egypt, and migrated to the US with her family when she was less than a year old. “Education was something my parents valued tremendously. That’s why we came to the US,” she explains.

Growing up in Queens, New York, she remembers being bullied while attending public school. “From a very young age, I learned about how people can treat you in a way that makes you feel unwelcome,” she says. She also soon found out that maths was something she liked and was good at. In high school, she took AP calculus; Ms Pellegrino, her maths teacher, used to tell her she should pursue maths at university.

But at university she didn’t pick maths. She ended up majoring in history with a focus on the Middle East to learn more about her roots instead, and started working in the publishing industry.

When she left her career in her 30s to go back to university, while tutoring on the side to support herself, not everyone was on board with the idea.

“I once told somebody that I was quitting my job to go back to school and study math. And their response was – ‘Seriously? You are not a math person. What are you talking about? Do you even remember the quadratic equation?’,” she recalls.

That ‘you are not a maths person’ attitude was one she would encounter more often than she would have liked.

And she did remember the quadratic equation. “Of course I did,” she says. “Ms Pellegrino taught it to me!”

Feeling awful at maths

A mathematics degree and a master’s were, unsurprisingly, a challenge. Dr Aly recalls working extremely hard and passing her exams, but support and, most importantly, recognition were a mixed bag.

Sometimes, she’d get straight discouragement, she says.

“When I said ‘I really want to study mathematics, understand the nature of mathematics, go to graduate school, and get a PhD’, some people were like, ‘nope, nope’.

“Some people told me, ‘You are not going to fit in. The level of effort and fighting that you're going to put in is not worth it’,” she recalls.

As discouragement piled up, self-doubt crept in. Her mental health took a hit. She had gotten through a mathematics undergraduate degree while working, had completed a master’s and was working on her PhD, but she was feeling incapable – and “awful at maths”.

“I felt worthless. I felt suicidal. It was awful – all of my self-worth was tied up in all this and how well I was doing in it.”

Eventually, she started working with a professor in the College of Education, and that is where she completed her PhD, with a thesis on computer-centred maths learning.

Healing

A first glimpse of her future work came when she started teaching maths in a community college during graduate school, and she witnessed students walking into the first maths class already declaring they were going to fail.

“And I hadn’t even given them the syllabus yet!” she recalls.

“These kids walked in with the horror stories of their experiences. I heard my experiences a million times worse.”

She realised she wasn’t alone, and this eventually led her to work on maths trauma, which she defines as the previous experiences that contribute to students disengaging from mathematics.

“Open a door to equity and social justice. Open a door to learning about inequitable school funding based on zip code and income in the US. Open a door to biases around ‘people of colour don't do math’ and ‘women don't do math’,” she explains.

“Think about all the subconscious messaging that girls get about their abilities. Think about the perceived and implicit bias – all this opens a huge world.”

Maths trauma and maths anxiety have been described in the literature and other work as negative experiences, born of a mix of misconceptions, elitism and discrimination, including racism and sexism, that can affect students and cause negative feelings towards maths.

For example, Sheila Tobias, author of the 1978 book Overcoming Math Anxiety, wrote in an article, as reported by The Harvard Gazette, that maths anxiety “is a serious handicap.”

She wrote: “It is handed down from mother to daughter with father’s amused indulgence. (‘Your mother never could balance a checkbook,’ he says fondly.)”

There is evidence that girls tend to have higher levels of maths anxiety, as highlighted in UNESCO's 2024 Gender Report.

For Dr Aly, maths trauma can have a more pervasive impact than maths anxiety.

“Math trauma can take shape in a number of different ways, including math identity, thinking you're ‘not a math person’, or imposter syndrome. Even thinking you're ‘a faker’ and can't legitimately do math – for example, when students succeed in math and attribute that success to luck,” she explains.

A teacher turned education journalist, Claudia has been writing about international education for the best part of 10 years. Originally from Italy, she worked and studied in Australia before moving to the UK in 2014. As a journalist, she worked for publications in the education sphere such as EL Gazette, The PIE News and Tes, specialising in research and data-led reporting, before going freelance in 2021. She holds an MSc in Educational Neuroscience and is passionate about education research.


Meet your AI applicant

By Seb Murray

In Brief

  • Students are increasingly using generative AI to draft and refine application essays, but universities and detection software struggle to reliably identify its influence in the final writing.
  • Heavy AI reliance risks creating generic, dull essays, stripping away the authentic voice and originality sought by admissions officers. A sizable majority of institutions currently lack formal rules on AI use in the admissions process.
  • Because sophisticated AI detectors are not a lasting solution, universities must strengthen AI literacy for all stakeholders. Alternative paths include securing assessments or requiring applicants to disclose AI use.

This year, many would-be university students have found themselves in a new kind of collaboration: drafting essays with the help of generative AI. Admissions consultants and educators say a growing share now ask ChatGPT or similar tools to “make it sound more mature” or “tighten the grammar”.

Many do so without their institutions knowing. While hard data on AI use in admissions is scarce, broader research shows how these tools have become part of everyday student life. A 2025 survey by the Higher Education Policy Institute (HEPI) found that nearly nine in 10 undergraduates now use AI tools for assessments, up from about half the year before.

And many do so regularly: almost a quarter of students reported using it daily for various tasks, while more than half said they use it at least once a week, according to the Digital Education Council. This familiarity is likely to carry over into the admissions process.

For many university applicants, AI is simply the newest aid: a natural extension of the spellcheckers and grammar apps they already rely on.

“We often encourage checking AI for specific tasks such as proofreading, revisions and identifying word choices (e.g. synonyms), as long as the applicant independently reviews and validates the recommendations,” says Stacy Blackman, an admissions consultant who advises business-school candidates.

Her firm, Stacy Blackman Consulting, discourages using algorithms to shape personal stories. “When it comes to school-specific research, we find that AI misleads applicants toward outdated, impersonal and/or overly generic information that can be a dealbreaker in admissions,” notes Blackman.

At Dickinson College, a liberal arts institution in Pennsylvania, Seth Allen says AI use in applications is hard to spot, but is likely happening behind the scenes anyway.

“It’s rare to find an admissions application with writing that can be identified with high confidence as having originated through AI,” notes the Vice President of Enrollment Management, “though just because we can’t see it in the finished writing, doesn’t mean it is not being used to brainstorm, refine ideas [or] develop approaches.”

Essays may look authentic even when algorithms have shaped their tone and structure. For admissions offices, that makes it hard to see where the applicant ends and the software begins.

Rene Kizilcec, Associate Professor of Information Science at Cornell University, has tested this growing overlap closely. His study published in March compared 30,000 human-written admissions essays with AI-generated ones – including some produced with added demographic details about the applicants.

The result was that the machine-made essays still sounded different from the human ones, and giving the AI extra background details about the applicants made little difference to how natural or individual the writing seemed.

Professor Kizilcec doubts that software designed to detect AI-written text will work accurately. “AI detectors are not generally reliable, and what’s worse, they are known to be biased against non-native English writers,” he tells QS Insights. “As soon as human and AI writing gets mixed, it gets more complicated to draw a clear line.”

He adds that admissions officers themselves are often overconfident about spotting AI use. “They think they know but they really don’t; [their accuracy is] only slightly better than chance. That, again, is concerning if readers have a bias against AI, and they can’t even tell.”

Universities are taking mixed approaches to AI: some are drafting formal rules on its use in applications, while others are waiting to see how the technology evolves before deciding whether new policies are needed.

Some are waiting for sector-wide standards, wary of writing policies that could date within months as the tech continues to evolve.

Stacey Koprince, Director of Content and Curriculum at Manhattan Prep, says: “A sizable majority of programmes we spoke with still don’t have formal rules on how applicants can use AI in the admissions process… For now, it remains a bit of a ‘wild, wild west’ with many candidates essentially calling their own shots.”

The company offers prep for admissions tests, and its advice to prospective students is simple: “We strongly encourage applicants to check each school’s individual guidelines; no one wants an application tossed over an avoidable misunderstanding,” Koprince says.

Supporters see AI as a leveller, helping students with fewer resources compete on more equal terms; critics see it as a new divide between applicants who have the knowledge and support to use these tools well, and those left behind without them.

“LLMs can help students who don’t have great resources write better essays,” Cornell University’s Professor Kizilcec says. “It can also make anyone’s essay sound generic and dull.”

Admissions officers share this concern: that overreliance on AI can strip away the personal voice and originality they look for in an application. “Many business school admissions officers have told us that they view generative AI as a valuable tool – when it’s used to support and elevate an applicant’s own work,” says Koprince at Manhattan Prep. “The key, they say, is for AI to enhance the applicant’s authentic voice, not replace it.”

Other universities take a harder line, arguing that using AI to write an entire essay, for example, crosses an ethical boundary; Koprince herself calls that a “misuse” of the tech.

“Fully AI-generated essays often read as generic and detached. That lack of genuine voice is exactly what admissions officers are on alert for,” she adds. “And they really can spot it.”

Seb Murray is a journalist and editor who writes often for the Financial Times and has written for The Times, The Guardian, The Economist, The Evening Standard and BBC Worklife. He focuses on higher education and global business, and also produces content for a range of corporate and academic institutions. Seb is a recognised expert on higher education and speaks at international conferences.