Published on: January 5, 2026 at 11:27 am
By Nick Keppler
Every major technology company has a suite of products for the education market. Apple's Classroom and Schoolwork, Google Classroom, and Microsoft's OneNote for Education, among others, compete for the business of administrators, teachers, researchers, and students at elementary, middle, and high schools, as well as colleges and universities. Like all tech products, they are increasingly bejeweled with AI features.
AI could subtly but fundamentally transform and limit the way we accumulate knowledge, said Academy of Management scholar Dirk Lindebaum of the University of Bath.
“The genie is out of the bottle, and we can see that a lot of big tech and edtech companies are moving into the education sector,” he said.
Large language models (LLMs), such as OpenAI's ChatGPT, Anthropic's Claude 3, and Google's Gemini, ingest vast amounts of information and can retrieve and recontextualize portions of that data in response to a user's prompt. But these systems can only provide "very, very narrow insight about a particular subject, not tap into the whole scope of possible knowledge," he said.
If users—nudged by the advertising and marketing efforts and the habit-forming power of tech giants—constantly dip into and repurpose an existing well of knowledge, can they create anything truly new? Will they be able to think as freely and openly as they would outside the limitations of these tools’ outputs? Lindebaum said he worries about that.
“As more and more big edtech firms enter higher education and create dependencies for academics, we see the transformation from epistemic agents to epistemic consumers,” he said.
While AI handily produces regurgitated or consensus-driven answers to questions, it often excludes new or unique perspectives or ones drawn from personal experience, he said. Also, AI algorithms and LLMs inevitably reproduce the biases embedded in their training material.
This is not just worrisome for academic inquiry, but also for far-ranging civic debates that advance and maintain democracies.
“Universities are also there to educate citizens for active participation in the process of democratic will-formation of a country, and for that to happen, we need informed public debate,” he said. “We need critical-reasoning skills.”
Unlike some creative professionals, such as screenwriters—whose concerns about AI partially fueled the 2023 Hollywood writers' strike—academics have not pushed back much against the encroachment of AI in their field, Lindebaum added. Oxford University is even partnering with OpenAI in a deal that will give students and researchers early access to new AI products.
“I’m afraid to say, I still see too much complacency or inaction from too many colleagues in academia,” Lindebaum said. “On the contrary, I see a lot of people working at universities celebrating this as the next level forward to drive scientific discovery, management research, and also education.
“We need to step back and really understand whose knowledge we are being reliant on for our management research, and whose knowledge we are facilitating in the classroom when we use these technological resources.”