
For years, I’ve taught courses on academic writing and presenting, co-authored a book with designer Elisabeth Sillmann, and evaluated portfolios comprising abstracts, posters, and oral presentations. However, with the launch of ChatGPT in November 2022, I had to revamp my course, because AI could now handle most of the tasks my teaching methods were built around. And Elisabeth and I had to rewrite our book, which had since grown into two books, one for MATLAB and one for Python (Trauth and Sillmann, 2026a, b).
In 2023, the first workshops on AI were organized at the university, and I vividly recall the first one I attended. Most participants adopted a defensive stance, citing common – and, of course, entirely justified – arguments against the use of AI in teaching and examining, and anticipating that university lawyers would draft restrictive clauses for their exams. At that time, I proposed organizing a workshop specifically for the natural sciences. This event turned out to be much more balanced, possibly because scientists have been using artificial intelligence for longer than those in other disciplines and therefore did not hold a fundamentally negative attitude.
Around that time, I embarked on the task of revamping my course and developing the first exercises on writing abstracts with AI assistance. The results were highly positive, and the students developed a healthy attitude towards AI. In one of the early exercises, they worked in two teams to write short essays on topics such as the termination of the Green Sahara and the emergence of Egyptian civilization. One team diligently refined prompts until the desired output was achieved. The other team had a great time figuring out why the AI output was incorrect and provided a detailed analysis of the results.
But how do you conduct final exams for such courses if you don’t want to restrict the use of AI? As I was thinking about this, and as more university workshops were being offered, the major publishers were considering how to adapt their policies and guidelines for the age of AI. As the author of several Springer textbooks, I was invited to a number of workshops and surveys organized by these publishers, most notably Springer itself. Their current policy prohibits the use of AI for writing books and papers but allows its use for improving the readability of texts. All other uses of AI must be documented, which we as authors are happy to do.
That’s exactly what I do in my courses: AI is permitted, but its use must be documented in detail. Moreover, this documentation is part of the submitted work, preferably included in the methodology section. The advantage of this, unlike a declaration of originality, is that I can consider this part of the work in my evaluation. Of course, we discuss what such documentation might look like: basically similar to the documentation used for other methods in the field, in the lab, and in data analysis. And yes, it is the student’s own work that is graded, not the work of an AI. Clever prompting counts as the student’s own work, as do the review and correction of results and the documentation of the process.
By the way, here’s a little anecdote to wrap things up: several participants in the course asked me about an advanced course. I was more than happy to oblige; we focused on writing concise, accessible press releases and short presentations. We also experimented with AI, but in the end, it was the texts written entirely without AI that generated the most enthusiasm. They were simply different, more original—perhaps more human? But the best part of the course was the personal interaction during the sessions; we spent a long time fine-tuning individual sentences over a cup of coffee until we were delighted with an unusual turn of phrase we’d come up with. A phrasing that was different from what the AI had provided.
What am I trying to say? Well, the students do want to learn how to write. Maybe not all of them, but the vast majority, and especially those who asked me for another course! These students use AI as a source of inspiration, for help with writing, and sometimes just for fun. They quickly learned that they need to write differently than AI does so that someone will actually read their work. And, above all, keep reading it! I remember well when Springer asked me in 2019 to review a book on lithium-ion batteries, a machine-generated summary of the topic, long before the introduction of ChatGPT and other chatbots (Beta Writer, 2019). In short, my take on it was: very impressive, but terribly boring.
An important note: this text is not intended to initiate fundamental debates about the advantages and disadvantages of artificial intelligence. It assumes the existence of artificial intelligence, which is here to stay. The primary objective is to encourage a discussion on how to effectively use AI in teaching and assessment. The readability of this text has been enhanced by the use of Apple Intelligence. The picture used to decorate the text was generated by ChatGPT using the prompt “Could you please create a nice symbolic image that I can use as decoration for the following text?”
References
Beta Writer (2019) Lithium-Ion Batteries – A Machine-Generated Summary of Current Research. Springer Cham, 247 pages. https://doi.org/10.1007/978-3-030-16800-1.
Trauth, M.H., Sillmann, E. (2026a) Collecting, Processing and Presenting Geoscientific Information with Python – First Edition. Springer Nature, in press.
Trauth, M.H., Sillmann, E. (2026b) Collecting, Processing and Presenting Geoscientific Information with MATLAB® – Third Edition. Springer Nature, in press.
