Artificial Intelligence

Artificial Intelligence (AI) now affects every field of education and is not subject-specific. This Research Guide is here to support your research and learning journey in Artificial Intelligence.

University of Calgary Teaching and Learning Supports

Guidance for Instructors: Using AI in Teaching and Assessment

As AI tools like ChatGPT become increasingly common in both academic and professional environments, instructors play a key role in guiding students toward ethical, transparent, and effective use. This section provides practical strategies for integrating AI into your teaching to support learning outcomes while upholding academic integrity.


1. Clarify Expectations Early

Set explicit, transparent policies regarding AI use in your course:

  • Include clear guidelines in your syllabus and course outlines.
  • Distinguish between acceptable uses (e.g., idea generation, grammar refinement) and prohibited uses (e.g., submitting entirely AI-generated work without attribution).
  • Discuss your AI policy in class to prevent misunderstandings.

2. Align AI Use with Learning Outcomes

Reflect on how AI tools can support your course objectives:

  • In skill-building or research-focused courses, AI may assist students with brainstorming, outlining, or summarizing.
  • For assignments emphasizing independent work, AI use may need to be limited or restricted.

Aligning AI use with learning goals ensures it enhances critical thinking and skill development rather than replacing them.


3. Model Responsible Use

Show examples of ethical and transparent AI use:

  • Demonstrate how you use AI in course preparation (e.g., generating quiz questions or summarizing readings) while verifying outputs.
  • Explain how to document AI’s role using discipline-appropriate citation styles.

This sets a precedent for responsible AI engagement.


4. Teach Critical Evaluation Skills

Help students critically assess AI-generated content:

  • Engage students with activities highlighting common AI pitfalls, such as biases, factual errors, or hallucinations.
  • Introduce frameworks like Mike Caulfield’s SIFT Method (Stop, Investigate the source, Find better coverage, Trace claims) to fact-check AI claims.

Encourage students to cross-verify AI outputs with peer-reviewed literature or trusted academic sources, fostering deeper information literacy.


5. Require AI Attribution and Reflection

Promote transparency and academic honesty by:

  • Requiring clear acknowledgement when AI tools contribute to assignments.
  • Requesting brief explanations of AI’s role (e.g., “ChatGPT was used to rephrase section 2 and create an outline for section 3”).

This fosters ethical use and facilitates student reflection on AI’s benefits and limitations.


6. Keep Policies Flexible and Updated

Given the rapidly evolving nature of AI:

  • Review and update your AI policies regularly.
  • Encourage open dialogue with students about emerging tools and ethical considerations.
  • Adapt policies as needed in response to new AI capabilities or institutional directives.

AI should augment, not replace, human thought. By setting clear expectations, modelling responsible practices, and teaching critical evaluation, instructors can empower students to navigate AI tools thoughtfully, maintaining academic standards while embracing innovation.

The above have been synthesised from the following links: