QS MidWeek Brief - July 23, 2025
Welcome! Last week, I spoke about my own experiences, but this week I want to focus entirely on today's students and some of the challenges they face. There's a lot.
AI usage inside and outside the classroom remains ambiguous for many students, and a hard and fast rule doesn’t seem likely to solve those challenges. Meanwhile, we meet up with some US-based students to find out how they’re spending their summer break. Many are very concerned about planning for their post-graduation jobs, while others have more immediate concerns, such as housing security.
Stay insightful,
Anton John Crace
Disclosure or Disarray?
By Gauri Kohli
In Brief
- Universities face confusion and low student compliance with AI disclosure policies.
- Defining legitimate AI use versus academic dishonesty remains a significant challenge.
- True academic integrity in the AI era requires systemic pedagogical redesign, not just declarations.
The widely debated question of what constitutes original student work is being redefined across universities, leading many to explore a new frontier: AI disclosure. At some universities, the focus is not only on detecting AI use but also on encouraging transparency and responsible use. However, early signs show confusion, inconsistency and growing ethical concerns.
This is the ethical grey area universities are grappling with, where the lines between assistance and academic dishonesty are becoming harder to define.
Policies and Early Challenges
The core of this evolving strategy is compelling students to explicitly declare and explain how AI tools assisted their assignments. This typically involves detailing which tools were used, what prompts were entered, how the output was edited and whether the student verified the results. Universities like Princeton and Georgetown in the US are following this approach, requiring detailed explanations of AI integration in student submissions. Similarly, the University of Melbourne asks students to defer to their lecturers’ guidance on acceptable use of AI (if at all) and provide written acknowledgment of the use of generative AI and its extent.
However, the implementation of these policies has been anything but smooth. Dr Chahna Gonsalves, a Senior Lecturer in Marketing (Education) at King’s Business School, King’s College London, conducted a study in 2024 that revealed a significant hurdle: 74 percent of students who used AI tools did not declare this use, despite being prompted by mandatory forms. This stark figure points to a disconnect between institutional intent and student behaviour.
“At the time, around 2023, we simply did not have clarity on how students were using AI tools to complete assessments,” she explains, highlighting that the initial declaration process was an “exploratory step toward cultivating a culture of transparency”, rather than a definitive solution.
“Disclosure can encourage transparency and critical reflection, but it is not a viable long-term strategy for ensuring academic integrity,” she tells QS Insights Magazine.
This low uptake, as Marc Watkins, Assistant Director of Academic Innovation at the Mississippi AI Institute for Teachers, suggests, could indicate a “communication gap or deeper issues of trust or clarity”. Students “respond favourably and recognise the importance of what is being asked” when given a clear pathway to disclose usage, he notes. Yet, in practice, as Dr Gonsalves’s research shows, a significant portion of students are either unaware, uncertain or hesitant to comply.
Professor Jeannie Marie Paterson, Director of the Centre for AI and Digital Ethics at the University of Melbourne, echoes this sentiment, stating that universities currently have “very little understanding about the overall use by students of generative AI and how they are using that technology”. For her, disclosure serves as an early response, offering “some insight to lecturers about student use” and acting as “a prompt to students to reflect on their own practices, and perhaps some friction in overuse.”
The Ethical Grey Zone
Experts say that one of the most pressing challenges arising from the use of AI in education is defining the ethical boundaries of collaboration. “Students and faculty increasingly don’t have shared experiences using this technology,” says Watkins. “Faculty resist AI while students embrace AI. We need to develop a more nuanced understanding of how and when this technology is employed in order to get to a place where AI best practices can arise. We aren’t there yet and won’t be for quite a while.”
“It is increasingly difficult to draw the line between legitimate and illegitimate uses,” observes Professor Paterson, noting that AI is “now embedded in operating systems and search engines”. She highlights that there is a normative and practical difference between authentic output and AI-generated work. “The problem for universities is that it is very difficult to identify AI generated work with confidence,” she adds.
Dr Gonsalves argues that instead of focusing on “tiered disclosure”, universities should adopt “tiered use instructions”, clearly communicating the acceptable scope of AI for each task.
The lack of consistent guidelines creates a navigation nightmare for students, a point several academics repeatedly flagged during interviews. “In the US, most universities allow individual faculty freedom to decide what course-level AI policies should be,” explains Watkins. “This is challenging for students to navigate as different professors will have conflicting views about AI and academic integrity along with different levels of understanding about what AI can do.” This inconsistency, while allowing for flexibility, simultaneously leads to confusion and raises concerns about fairness.
Over-regulation, a Risk?
While the impulse to regulate AI use is understandable, experts caution against overly rigid or unclear policies that could inadvertently penalise students and stifle innovation. Dr Gonsalves highlights the significant risks for student equity: “Students from non-traditional backgrounds, those with learning differences, or those working in a second language often rely on AI tools for support in ways that are legitimate and necessary.” Without clear distinctions between legitimate support and unacceptable substitution, these students risk being unfairly penalised.
Moreover, overly strict AI bans could have a “chilling effect on experimentation and curiosity”. In today’s professional landscape, AI is already deeply integrated into numerous fields. Prohibiting or discouraging its use in educational settings might leave students ill-prepared for the realities of the workforce. “As a marketing educator, I see an additional risk in that overly restrictive policies may actively undermine the type of innovation we seek to encourage,” says Dr Gonsalves.
Professor Paterson views disclosure as an “educational response” to both staff and students about generative AI use. In practice, disclosure helps clarify what’s acceptable for students. It may also help staff have better insight into how well students understand academic integrity policies on AI use.
She argues that disclosure “should not be the basis, or at least the sole basis, for penalising students who have used AI in ways that contravene university policies”. If it were, dishonest students who choose not to disclose would be at an advantage over those who are transparent.
Find out what’s working and how standardisation can be implemented online.
Gauri Kohli specialises in writing and reporting on higher education news, including analysis of higher education trends, policies and the edtech sector. Her writing covers international education, study abroad, and student recruitment trends and policies, with a focus on India as a market. She has also covered workplace and hiring trends, corporate practices, work-life features, startup trends and developments, and real estate for leading publications and media houses in India and abroad over the last 18 years, including Hindustan Times, a leading national daily newspaper in India.
Day in the Life of Students in Summer 2025
By Julie Hoeflinger
In Brief
- Students are juggling multiple responsibilities to manage financial pressures and build their resumes in a competitive job market.
- The oversaturated job market and rising housing costs are significant stressors, forcing students to make difficult choices about their education and living arrangements.
- While students find ways to create moments of joy amidst challenges, collective efforts are needed to improve conditions for young people.
“Every summer, I have to work a part-time job, considering I come from a lower-middle class family,” says Jenelle, a third-year student at Fordham University in New York.
“This summer, I applied to be a Joy Creator at Nothing Bundt Cakes. I’m also taking a summer course in Introduction to Theatre as a course requirement at a local transient college, and I’m doing research with a professor in Computational Neuroscience.”
On top of working internships, taking classes and oftentimes squeezing in volunteer hours, students often find that internships are unpaid, forcing them to work another job on top of prior commitments. But even in describing the variety of demands placed on her, Jenelle spoke with great enthusiasm, explaining that one of her new year’s resolutions was to face her social anxiety by making herself as “uncomfortable as possible”; in this case, that meant intentionally choosing a customer service job and committing to overcoming her fears.
Each of the three US-based students I spoke to reported a combination of summer classes, internships and additional jobs to pay the bills. Maggie, a rising sophomore at the University of Tennessee (UT), shared what her summer looks like this year. “I’m studying public affairs, which is what led me to my current job, this internship that I have right now [with] the UT’s Institute for Public Service,” she explains, her passion for learning about local government coming through clearly.
Maggie also works at a local insurance agency, a role she's held for three and a half years. “That job has just been great for me because… first, it pays well,” she chuckles candidly. “And I can work from home, and it's [relatively] flexible, so I'm able to work remotely from anywhere, [allowing me to] continue the job during school.”
Meanwhile, Faith from the University of Michigan is headed into her final year of a Human-Centered Engineering Design degree. After completing a class in the first half of the summer, she’s now working four days a week at a quality engineering internship with an automotive parts manufacturer.
“I spend over an hour driving between home and work each day, and my favorite way to pass the time is singing loudly in the car. In my free time, I’m hanging out with loved ones, working with my classmates to file a provisional patent for a class project, getting ready for the upcoming school year as the Public Relations Coordinator for my local chapter, and soaking up the sun.”
Faith’s schedule revealed a striking truth: free time isn’t exactly free. Do students genuinely have the time or energy for personal hobbies amid all these commitments?
“Most of my summer’s spent working full-time,” Maggie explains, “but when I do have time to do other things, I try to get dinner with my friends or family, and I love to read and get outside and be active. But it's hard when I'm working 8-5 every day,” she says with a laugh.
Meanwhile, Jenelle says her favourite summer hobbies are running, reading outside and “staying up to date with politics”. Many students, even those outside these interviews, have mentioned politics at one point or another.
Find out about housing challenges and other obstacles students are trying to overcome in the online magazine.

Julie has been working as a content creator in the higher education sphere for over six years, with work including long-form journalism, news articles, branded content, and hosting and moderating in-person and online events. With clients spanning the globe, she’s been privileged to meet and interview some incredibly interesting people and cover a huge range of topics, from the emergence of AI to the impact of political legislation on the HE sector.