Artificial Intelligence at UW

Last updated November 2024

Generative artificial intelligence (genAI) tools are not the first technological advances to raise concerns at the University of Waterloo and beyond, and they will not be the last to have important implications for how we teach and how learning happens at Waterloo.

Our response has always been to move beyond policing the use of new technology to asking what its appropriate role is in our pedagogy and our curricula, and to use it as a way to improve student learning by adapting our teaching. Sorting this out will be a long-term process and the work of many contributors with a range of expertise. Many questions centre around academic integrity and effective uses for these tools. At Waterloo, we know that innovations in technology often produce innovations in teaching and learning, and these new developments are no exception.

University of Waterloo staff and faculty have access to Microsoft Copilot through their institutional accounts. Microsoft Copilot should be used through your UWaterloo account to protect the privacy and security of the data you enter.

The information on this page is maintained by the University of Waterloo's Standing Committee on New Technologies, Pedagogies, and Academic Integrity, a group of interested faculty members, staff, and students convened by the Associate Vice-President, Academic with support from the Centre for Teaching Excellence and the Office of Academic Integrity. The remit of this group is to be aware of emerging tools and platforms, formulate advice on appropriate adaptations of pedagogical practices, and make recommendations about how to effectively modify policies and practices in support of academic integrity.

For additional information about the University of Waterloo's stance on the use of generative AI, please visit the Generative AI at the University of Waterloo slide deck.

On-campus support

  • Course and assignment redesign: Centre for Teaching Excellence (Faculty liaisons)
  • Online course and assignment redesign: Centre for Extended Learning (Agile development team)
  • Designing written assignments: Writing and Communication Centre
  • Encouraging students to work with integrity: Office of Academic Integrity
  • Citing and disclosing genAI: Library

More about genAI at Waterloo

ChatGPT and similar technologies are chatbots backed by artificial intelligence that can mimic human conversation and writing. Given the speed with which generative AI tools are emerging, the following information refers specifically to ChatGPT as one example of these types of tools.

ChatGPT is a Large Language Model that learns the statistical structure of language, such as patterns of word usage, to generate answers based on a probability distribution over word sequences. As ChatGPT composes an answer, it determines the most likely word or sequence that should come next, based on the training it has had to date. These tools can be used for a variety of tasks, including drafting emails or blog posts, composing essays, and even generating, debugging, and documenting code. The technology is particularly powerful because it can mimic writing or coding styles relatively effectively, making it flexible and widely applicable.
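The next-word mechanism described above can be illustrated with a toy sketch. This is a deliberate simplification: the bigram model below picks the most frequent next word from word-pair counts in a tiny corpus, whereas models like ChatGPT use large neural networks trained on vast corpora.

```python
from collections import Counter, defaultdict

# Toy illustration only (not ChatGPT's actual architecture): count how
# often each word follows each other word in a tiny training corpus.
corpus = "the cat sat on the mat the cat ate the fish".split()

follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def most_likely_next(word):
    """Return the highest-count next word observed after `word`, or None."""
    counts = follows[word]
    return counts.most_common(1)[0][0] if counts else None

print(most_likely_next("the"))  # "cat" (follows "the" twice; "mat"/"fish" once each)
```

A real language model does the same kind of "most probable continuation" selection, but over learned probabilities conditioned on the entire preceding context rather than a single previous word.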

Some see this technology as the next generation of word-processing tools, like predictive text and grammar checkers. Rival companies are releasing their own generative AI chatbots, and tools for nearly any purpose are already available or in development.

While generative artificial intelligence (AI) tools are not new, the quality of responses generated by newer models like ChatGPT surpasses that of prior AI-based writing tools, sparking debate in the higher education community about their use in teaching and learning.

Considerations while using Generative AI

Limitations of generative artificial intelligence

There are limitations to genAI tools, which generate responses based on a prompt. For example, the free version of ChatGPT uses information found freely on the internet (up to 2021) to generate responses to prompts, but it is not a search engine. It "learns" through access to information curated by researchers and developers, and through ongoing input from its users, which may contain errors, misconceptions, or bias (Wu, 2022). This field is developing rapidly, and at the time of publishing these guidelines, similar tools are being adopted by search engines like Google. Verification and triangulation of information are important when using ChatGPT, as it is unable to distinguish between important and unimportant errors (Ramponi, 2022). While it can, if asked, provide references for content, they will often appear plausible but be incorrect (Montti, 2022).

Ethical considerations and bias

The responses generated by genAI can be incorrect and may include bias (Wu, 2022) inherent to the dataset on which it was trained and/or introduced by those reviewing and selecting the information to include. One writing specialist tested ChatGPT's response to a prompt and found that it reinforced, rather than complicated, a simplistic ideological stance, unlike Wikipedia or even Google searches. The specific phrasing of prompts affects how the AI responds, including the degree of bias in the response. GenAI chatbots such as Google DeepMind's Sparrow and Meta's Galactica had unsuccessful releases for this very reason (Wu, 2022).

Privacy and security concerns

GenAI evolves through its interactions with humans in the form of prompts and chat exchanges. As of February 2023, OpenAI retains all prompts users have entered even if those users delete their accounts, and ChatGPT has not undergone an Information Risk Assessment at the University of Waterloo. OpenAI's commonly asked questions page has more information on the company's own policies.

Copyright concerns

GenAI and related technologies are subject to existing laws and regulations in Canada, such as intellectual property, copyright, and privacy laws, among others. The legal status (e.g., copyright legislation and case law) of genAI services is currently unsettled in Canada. More information is available from Waterloo's Copyright Advisory Committee.

Teaching and Learning with AI

Explicit communication with students

It is important for instructors to be explicit about whether artificial intelligence tools may be used to complete assignments, tests, or exams. If such tools are allowed, instructors need to be explicit about the extent to which they are allowed, along with citation guidelines. A student who does not comply with the instructor's rules about the use of such tools will be subject to Policy 71 and an investigation into academic misconduct.

Detecting whether a student has used AI to work on an assignment

We do not recommend the use of AI surveillance or detection technology.

Waterloo supports Turnitin's suite of tools; however, the AI detection tool is unreliable and cannot be used as the sole source of evidence for an academic offence. Note that the current boilerplate Turnitin language applies to both originality reports and the AI detection report.

Low-tech strategies to deter the inappropriate use of generative AI should be employed, such as encouraging students to keep their draft work, rough notes, references, and any prompts (if allowed) that they might have used. Students should be informed that they may be asked to show their work process, demonstrate their knowledge, or speak to their work upon request by their instructor.

Reducing the risk of student cheating with AI

Evidence-based assignments and assessments can help encourage students to complete their own work and make unapproved AI chatbot use easier to identify. When adopting any approach (e.g., visual mapping or oral presentation), keep accessibility and inclusion at the forefront of redesign.

Staff at the Centre for Teaching Excellence, the Centre for Extended Learning, the library, and the Writing and Communication Centre are available to consult on assessment redesign. The Office of Academic Integrity can suggest strategies to maintain integrity and discourage academic offences. 

The Centre for Teaching Excellence has collaborated with other academic support units to develop resources for instructors.

Inspiration for assignment redesign: 

  • Research-based assignments that require scholarly resources cited accurately, especially resources found behind a paywall through Library subscriptions 
  • Assignments that show how a work evolved over time (e.g., requiring the submission of scaffolded assignment components, or drafts, or tracking changes) 
  • Auditory or audio-visual presentations (e.g., live or asynchronous presentations) 
  • Use of social annotation tools, like university-supported Perusall, that require students to respond to content in concert with their classmates
  • Assignments or assessments that require content to be produced in multiple and accessible modes (e.g., create a visual or infographic and then write about it) 
  • Assignments where students are asked to include real world or personal examples, or demonstrate how the answer ties with course content or a case study discussed in class 
  • Assignments that require metacognitive skills like reflection on the content and process before, during, and after submission 
  • Assignments that are about very recent news or developments in a field

Using AI tools effectively for teaching and learning

There may be opportunities to leverage AI to enhance teaching and learning either by teaching about AI or teaching with AI tools. Introducing AI through discipline-specific classroom conversations can encourage students to think critically about AI and its potential societal implications like accuracy, privacy and security, and bias. Activities or assessments that require learners to analyze, improve, or critically evaluate text or code generated by chatbots can help develop students’ higher order skills. 

Examples: 

  • Provide a prompt and the resulting text, and ask students to improve on it using track changes
  • Ask students to critique and grade a properly referenced prompt and text, and then write an improved text collaboratively in class
  • Generate two different texts and ask students to explain the shortcomings of each, or to combine them in some way using track changes
  • Document code and test documentation accuracy with a peer
  • Facilitate authentic learning: use AI to create personalized case studies for student groups, tailored to student interests
  • Use ChatGPT to provide a starting summary of a debate or issue as a springboard for research and discussion, identifying what it is missing

Requiring students to use AI for assessment

There are many possible uses of generative AI in assessment. However, ChatGPT is not fully accessible globally and now offers paid subscription tiers, so plan alternatives if it is used in assessments or activities. One option is to have students work with ChatGPT content provided by the instructor, rather than asking every student to sign up for this particular tool.

Using AI to create or grade instructional materials

GenAI tools have functionality that may create efficiencies for planning instruction and producing content. The University of Sydney has prepared advice about how AI can be used meaningfully by teachers and students in 2023.