
Artificial Intelligence

In today's world, Artificial Intelligence (AI) impacts all fields of education and is not subject-specific. This Research Guide is here to support your research and learning journey in Artificial Intelligence.

Before Using AI in Your Courses

When using AI tools in your learning, here are some suggestions for ethical and responsible ways to do so.

Before using an AI tool for your coursework: Have a conversation with your instructor regarding AI use on your assignments and research. If you are unsure whether use of a specific tool or using AI tools in general is allowed in your course, reach out to your instructor. Having conversations early is the best way to avoid confusion.

Explore AI software and tools to understand what they can and cannot do, especially with topics you already know a lot about. Take the time to critically analyse their responses. AI often lacks the critical thinking skills needed to complete your assignments.

Some ways students have been using AI tools in their coursework: 

  • asking for comments and feedback on their assignments and papers 
  • preparing for debates by reviewing counter-arguments
  • asking for further explanation of topics they found confusing in class or in assignments.

University of Calgary AI Supports

The Centre for Artificial Intelligence Ethics, Literacy and Integrity (CAIELI) at the University of Calgary is a transdisciplinary initiative between Libraries and Cultural Resources and the Werklund School of Education.

The Generative AI Tools for Research (Fall 2024) CAIELI workshops explore the emerging role of generative AI tools in identifying relevant literature for academic projects. Participants will gain insights into various AI-powered platforms designed to support research and academic writing and learn to identify the appropriate tools to fit their needs.

These sessions will cover:

  • Identifying and evaluating generative AI tools relevant to literature searching and understanding how to incorporate them into the research process
  • Selecting appropriate tools for research needs
  • Critically assessing the limitations and ethical implications of AI-powered tools in academic work

Explore the resources on the Generative AI Use in Graduate Studies page at the University of Calgary.

Critical Thinking & AI Tools

Why Critical Thinking Matters When Using AI Tools

When you use generative AI tools for assignments, research, or learning, it’s essential to think critically about the information they provide. These tools can produce convincing, well-written answers, but they don’t always show where their information comes from. Without clear and reliable sources, it’s difficult to verify whether the content is accurate.

This lack of transparency means there is always a risk that the AI might present outdated facts, incomplete explanations, or even completely incorrect claims. In some cases, it may unintentionally spread misinformation, false claims, or misleading details about a topic. The AI isn’t “lying” in the human sense; rather, it generates responses by predicting what words or ideas are likely to come next based on patterns in the data it was trained on and within its reasoning model.

As a student, your responsibility is to evaluate AI-generated answers the same way you would evaluate any other source:

  • Check facts against reliable academic or professional references.

  • Identify biases — Is the answer presenting one perspective without acknowledging others?

  • Look for missing details that might be important for a nuanced understanding.

  • Cross-verify with peer-reviewed articles, textbooks, or trustworthy websites recommended by your instructors or your university library.

By approaching AI tools with curiosity but also skepticism, you can benefit from their speed and flexibility while safeguarding the quality and credibility of your own work. Remember: AI should be a starting point to guide your research — not the final word on a subject.

The SIFT Method

The SIFT method, developed by digital literacy expert Mike Caulfield, is a practical framework designed to help users critically evaluate information found online. Especially in university settings, where students increasingly rely on digital sources for research and assignments, SIFT empowers you to address misinformation risks by prompting specific steps. 

Instead of immediately trusting or sharing content, you begin by pausing to check your emotional reaction and ask yourself if you already know and trust the source. This simple act of stopping helps interrupt the urgency and bias that digital media often encourages, making space for reflection and skepticism before you engage further.

For university students, using SIFT alongside your own critical thinking skills creates a more robust research process. If you encounter a persuasive claim or striking image, you investigate who published it, what their expertise and agenda might be, and whether the information is supported by more reliable or consensus sources.

By seeking out original reporting or empirical studies (rather than simply accepting summaries or secondary interpretations), you gain context and clarity. The SIFT method doesn’t just help you verify facts; it teaches you to be more mindful of bias and the limits of online media, so your academic work rests on accurate, trustworthy foundations. It encourages curiosity combined with skepticism, a vital balance in today’s digital research environment.


The SIFT Method for Evaluating Online Information

S: Stop

Pause before you react, share, or believe the information. Consider your emotional response—does the headline or claim provoke a strong reaction? Don’t rush to accept or spread information without reflection. Ask yourself:

  • Do I know this source?
  • What is my purpose for looking at this information?
  • Could this be sensationalized or misleading?

I: Investigate the Source

Find out who is behind the information. Research the author, website, or organization publishing it by:

  • Looking up their credentials and expertise.
  • Checking the website’s "About" page or mission statement.
  • Considering potential biases or conflicts of interest.
  • Using lateral reading—open other tabs to see what reputable sources say about this source or author.

F: Find Better Coverage

Look for other trustworthy sources reporting on the same topic. Instead of relying on a single source:

  • Check well-known news outlets, academic publications, or fact-checking sites.
  • Compare multiple perspectives to get a fuller, more balanced picture.
  • Be wary if you can’t find credible sources confirming the claims.

T: Trace Claims, Quotes, and Media Back to the Original Context

Identify the original source of the information and verify it directly. This involves:

  • Finding original studies, reports, or interviews cited.
  • Checking whether quotes or statistics are used accurately and in full context.
  • Being cautious about manipulated or out-of-context images, videos, or snippets.

Citing AI Tools

The consensus on citing AI tools like ChatGPT has evolved beyond simply labeling them as "personal communication."

Most major citation styles, including APA, now recommend citing AI-generated content similarly to software or algorithm outputs, listing the AI model as the author (e.g., OpenAI), specifying the version and date of the tool used, and including a URL to the AI platform. This citation format acknowledges that ChatGPT responses are non-recoverable by others (since they are dynamically generated and not archived) but treats the AI as a formally authored software tool.

For instance, APA style guidelines suggest referencing ChatGPT with the author as OpenAI, the title as ChatGPT (with the version date), a descriptor like [Large language model], and the URL https://chat.openai.com. In-text citations use the author and year of the tool's version you employed. Additionally, APA advises explaining the AI's use in your methodology or introduction, including the exact prompt you used if you quote or paraphrase the AI content.
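For example, a reference entry and in-text citation following this pattern would look something like the lines below. The version date shown is only illustrative; replace it with the version of the tool you actually used, and confirm the details against the current APA guidance.

Reference entry: OpenAI. (2023). ChatGPT (Mar 14 version) [Large language model]. https://chat.openai.com

In-text citation: (OpenAI, 2023)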

Despite these formal citation guidelines, it remains important to check with your instructor to confirm whether using and citing AI tools like ChatGPT or Microsoft Copilot is permitted in your assignments, as acceptance varies across courses at the University of Calgary. Moreover, because AI tools can generate inaccuracies or fabricated sources, critical evaluation and corroboration using verified academic sources are essential. When citing AI, clarify how you used it (e.g., for drafting, idea generation, or editing) to maintain transparency in your research process. This balanced approach ensures academic integrity while leveraging AI's benefits responsibly.

For exact citation style support, visit this page.