Last spring, a history professor at Touro University gave undergraduate students a graduate-level challenge: Analyze a mysterious artifact with writing in an unfamiliar language. The twist was that students were encouraged to use artificial intelligence to help.
Rather than replacing student thinking, AI became a catalyst for deeper thought—students debated different analyses, compared AI output to historical sources and explored possible interpretations in ways no lecture or textbook could have provoked.
This experience was made possible by the professor’s disciplinary expertise and his thoughtfulness about student needs. Yet many campuses default to centralized AI initiatives developed by senior administrators or advisory councils that tend to be risk-averse.
Instead, universities should unleash the incredible resource hiding in plain sight: our faculty. Give them the power to experiment with AI while preserving AI-free “sanctuaries” when warranted.
By doing this, we will learn how to teach students to use AI responsibly and redesign our curricula so that AI enriches, rather than undermines, our educational aims.
Broad AI engagement
Some argue that we should banish AI from education altogether, warning that it will replace thinking and hasten the decline of educational institutions. But the genie is out of the bottle; AI is already woven into every workplace and news feed our graduates will encounter.
Teaching critical thinking in an AI-free vacuum is unrealistic and would only hold students back. Critics also overlook AI’s potential to revitalize higher education—running models, offering feedback and sparking creativity.
Properly managed, AI can enhance learning across disciplines. It can help students brainstorm and identify weaknesses in their writing, showing them how to critique, edit and reflect. Interactive AI can offer feedback on oral presentations and simulate debate, enabling students to sharpen arguments and reconsider assumptions.
Of course, we must have safeguards: AI-proof assessments, active reflection on AI use and training on its limits. Universities should also establish a broad ecosystem of AI engagement—a platform for faculty to experiment, cross-pollinate insights and share successes and stumbles.
Closest to the work
Economist Friedrich Hayek described the “knowledge problem,” arguing that a central office cannot grasp the nuanced knowledge and insight scattered among individuals. In this vein, a provost’s edict is no substitute for a professor’s judgment about when AI enriches the classroom and when it detracts.
Professors must be permitted to explore this technology to discover how best to use it in philosophy seminars, engineering labs, clinical pedagogy and literary workshops.
This model is backed by research on leadership in higher education. Studies show that by building a culture that values diverse perspectives and giving faculty real decision-making power, universities benefit from stronger innovation and higher morale.
Centralized directives—even those from large, diverse committees—inevitably miss critical nuances. What works in a data science lab may cripple a poetry seminar. Faced with such uncertainty, we risk constant policy revisions or special exemptions that waste time and demoralize faculty.
Faculty innovation should be encouraged with small internal grants or teaching-release fellowships to design and pilot AI applications tailored to their curricula. Thanks to a recent philanthropic gift, nearly 100 faculty across Touro University’s 35 schools and colleges are developing materials to experiment with AI in their fall 2025 courses.
Similarly, the California State University system has funded more than 60 faculty-led projects through its AI Educational Innovations Challenge. Both initiatives put decision-making power and resources in the hands of those closest to the work.
A platform for educational entrepreneurship
But incentives are only the beginning. To foster a culture of discovery and sharing, universities should host regular AI summits and expert-led trainings. What matters is creating ongoing, low-cost opportunities for faculty to swap ideas, problem-solve, and share innovations.
For example, the Generative AI Fellows Pilot Program at the University of Nevada empowers faculty to explore AI’s potential in research and teaching, with support for experimentation and sharing.
Envisioning the university as a platform for educational entrepreneurship is both revolutionary and a return to the academy’s roots—reviving the university as a true “community of scholars.” Faculty become partners in shaping teaching and research, not just implementers of top-down initiatives. Experimentation with AI integration sparks cross-disciplinary dialogue and revives the collaborative spirit of higher education.
Instead of trying to predict trends, institutions should focus on testing ideas and fostering genuine innovation. By shifting trust and resources toward those in the field, schools move from guessing what might work to discovering what actually does.
Tenure and promotion guidelines should reward educational creativity, with recognition for faculty who share resources or develop open-source tools. Training modules, mentorship networks and informal gatherings like lunch-and-learns signal that risk-taking is valued, embedding experimentation at the heart of the university’s mission.
Top-down mandates cannot reinvent higher education amid enormous change. We must empower every member of our university communities to experiment, learn and grow.
Our job as leaders is to provide the infrastructure and incentives that foster a culture of innovation so our institutions can not only meet the challenges of AI but thrive amid them.