
How to shift AI from a shortcut to a learning partner

Rudy Gonzalez
Rudy Gonzalez is the managing director of higher education at Unisys.

Artificial intelligence is rapidly transforming higher education, from how students research and prepare assignments to how they study for exams. In 2024, 86% of students reported using AI in their studies, and 54% incorporated it into their work every week.

However, as AI becomes a larger part of education, it may impact critical thinking skills.

A 2025 MIT Media Lab study found that students who extensively used AI tools like ChatGPT showed weaker critical thinking, problem-solving and memory development. This impacts how students learn and could make them less attractive to future employers, who expect graduates to use AI effectively while retaining these critical skills.

This creates a significant challenge for universities: how can they integrate AI in ways that support learning rather than replace it?

The solution is not simply adding AI tools to coursework. Instead, institutions need a clear strategy, strong governance and ongoing faculty development to guide how AI is used in the classroom.

Higher education leaders must also foster a culture of change, one that guides students, faculty, and staff to embrace the transformative power of AI over its perceived threats or challenges.

Do you have a chief AI officer?

The first step in building a strategic framework involves establishing dedicated leadership to oversee AI implementation and use across campus.

Over the past year, organizations have begun appointing their first chief AI officers, tasked with addressing many challenges that AI can pose for higher education, including managing ethical concerns, faculty unease and student resources. This role oversees AI initiatives and coordinates between departments to align AI applications with educational objectives.

More importantly, leaders in this role monitor the impact AI is having on student learning, adjust practices as needed, and ensure that adoption is equitable across disciplines and student populations. In practice, these leadership structures are already helping universities like George Mason provide more precise guidance for faculty and improve consistency in AI adoption.

Structured governance like this helps maintain academic integrity, fosters trust among faculty and students, and ensures that AI enhances rather than undermines the learning experience. Having a central AI leader or team helps universities apply AI consistently and gives everyone clear guidance on using it responsibly.

A thoughtful engagement with technology

Educators are key to bringing AI into the classroom, but many are still learning to use these tools. To support them, colleges and universities need to adopt a use-case-driven approach to identify appropriate AI tools.

Training programs can give faculty hands-on experience, help them recognize errors or biases in AI outputs, and help them design assignments that strengthen students’ critical thinking alongside AI.

Through specialized training programs, instructors can better identify the best ways to use AI in their courses. Moreover, they can demonstrate how AI can support student learning without replacing essential teaching practices.

Training and development can take many forms. Peer mentoring programs and cross-departmental collaborations can support faculty by allowing instructors to share strategies, troubleshoot challenges and develop consistent approaches to AI use across courses.

These tactics enable well-trained teachers to help students develop necessary judgment alongside technical skills. Over time, this cultivates a campus-wide culture of thoughtful engagement with technology, where AI enhances learning rather than shortcuts it.

Learning to harness technology

Like any transformative technology, AI disrupts the classroom. Students’ use of AI has raised legitimate concerns about academic integrity.

Universities can’t rely solely on detection software to address these challenges. Instead, they need a dual approach: establishing clear policies on AI use while redesigning assessments to minimize opportunities for misuse.

To help mitigate misuse of AI and encourage critical thinking skills, coursework should give students ways to demonstrate understanding that AI cannot replace. Educators can accomplish this through presentations, collaborative projects, real-world simulations or iterative assignments that require critical thinking.

Universities can also emphasize the learning process, not just the final product. Encouraging students to explain their reasoning and reflect on AI-assisted work fosters deeper engagement.

Peer review and group collaboration further reinforce accountability while allowing students to learn from one another. This approach ensures that students graduate with technical proficiency and critical thinking skills and are prepared to navigate workplaces where AI supports, rather than replaces, human judgment.

AI offers tremendous potential for higher education, but its integration requires careful planning. Universities can harness AI as a learning partner rather than a shortcut by establishing strong governance, investing in faculty development, protecting academic integrity, balancing AI with essential skills, and fostering an AI-conscious campus culture.

When AI is used deliberately, students learn to harness technology without losing the judgment, creativity, and problem-solving skills that matter most in the workplace.
