Artificial Intelligence in Teaching: Information for Faculty

Last updated on: April 13, 2023

This page has been created to help YU educators understand generative AI and its implications for teaching, learning, and assessment.

AI technology is constantly evolving. The content on this page will be monitored and updated as new information becomes available.

What is Generative AI?

Generative artificial intelligence (AI) is a category of technology that can produce new content, including audio, code, images, text, simulations, and videos. Recent breakthroughs in the field, including the public release of ChatGPT in November 2022, are already radically changing the way individuals and organizations approach content creation. Awesome Generative AI provides an extensive list of generative AI technologies, categorized by type, including text, code, image, video, and audio generators. You are encouraged to check the privacy policies and terms of use for any generative AI tools before applying them. 

There are several AI technologies currently garnering significant attention in higher education, such as OpenAI’s ChatGPT, DALL·E 2, and Whisper.

Generative AI in Teaching & Learning: Institutional Expectations

Can I experiment with generative AI tools to better understand their capabilities and limitations?

Currently, several generative AI technologies, including ChatGPT, Whisper, and DALL·E 2, are available to access and trial at no cost with limited functionality. You may wish to explore how the tools respond to course assignments or prompts or brainstorm opportunities to use the tools to support class activities or inspire discussions. Because of the overwhelming interest in these systems, users may experience outages or delays when accessing them.

Can I use generative AI tools for teaching and learning purposes in my courses?

There are many creative opportunities for experimenting with and applying generative AI technology to support teaching and learning. You can introduce and invite students to test AI through ungraded in-class and online activities. Acceptable use of AI in summative (graded) assessments should be determined by the academic program, through its curriculum committee, and may vary across courses and types of assessments. Any explicit requirements, guidelines, or restrictions on the use of AI-generated material should be clearly communicated in the assignment or test descriptions.

Requiring students to apply generative AI tools in summative assessments is complicated by the fact that many have not been vetted by YU for privacy or security and that students may need to disclose personal information to gain access to generative AI services (e.g., phone numbers). If tools like ChatGPT are integrated into course activities or assignments, you and your students should review the tool’s privacy policy and consider privacy implications. Students should also be offered an alternative option if they do not wish to create an account that would allow them to engage directly with the generative AI tool.

When considering the integration of AI into assessments, especially ones that may involve industry or community partners, or ones that may form part of a student’s professional portfolio or may be submitted as part of a competition or creative entry, it will be especially important to consider any industry-standard expectations and guidelines around AI use or any specific directives from our partner organizations.

If a student uses generative AI tools to complete an assessment, is this considered academic misconduct?

Assessments help students consolidate their understanding and demonstrate the skills needed to perform in professional roles after graduation. To successfully complete a program, students must independently demonstrate achievement of course and program learning outcomes.

While AI tools may be used to explore disciplines, spark ideas, and support studying, students should openly acknowledge and document their use of generative AI in any coursework (see the Student page for examples of appropriate and inappropriate AI use in educational settings). Assignments and tests may have explicit requirements or restrictions on the use of AI-generated material. By submitting coursework, students assert that they have respected these requirements. Where students incorporate AI-generated material into coursework, they assert that it accurately reflects the facts. They also take full responsibility for AI-generated materials as if they had produced them independently: ideas must be accurately attributed, and facts must be carefully and accurately reported.

A student may not represent AI-generated work as their own. A student who uses generative AI to complete any assessment in whole or in part without appropriately acknowledging and citing the AI-generated content violates our Academic Integrity policy. Such unauthorized AI use may be considered cheating as described under the Academic Offences section of the Academic Calendar, "Employing any unauthorized academic assistance in completing assignments or examinations," or plagiarism, “the act of representing someone else’s work as one’s own.”

If a student is allowed to use generative AI in an assessment, how should they cite it?

While citation guidelines are evolving, here are some sample APA guidelines for citing text and images sourced from artificial intelligence that may serve as a helpful starting point.

Can I use ChatGPT or other generative AI tools to assess student work?

YU is committed to providing students with high-quality, authentic, and actionable feedback informed by the academic and professional experiences of our faculty. You should not submit student work to any third-party software system for grading, or any other purpose, unless the software is approved by YU. A completed assignment is the student’s intellectual property (IP) and should be treated with care.

How can I detect the use of generative AI? Should I use detectors like GPTZero or others?

You should regularly monitor student work for significant changes in the style or quality of their submissions, or for content and references that, while plausible, are factually incorrect or do not exist. In addition to your own efforts, an AI Indicator (beta version) is integrated within Turnitin, our approved, enterprise-level plagiarism detection tool. For more information on how to use the AI Indicator, including its strengths and limitations, review our Turnitin AI Indicator tip sheet. The Turnitin AI Indicator is only one limited form of evidence of possible academic misconduct and should be applied with caution, and only if other evidence of integrity issues exists (e.g., a marked change in writing style or level, incorrect or falsified references, or falsified information or data).

YU discourages the use of other third-party AI-detectors, like GPTZero, on student work. The efficacy of these detectors is limited, and some detectors use the level of writing sophistication to make determinations about whether the piece was generated by AI. Making assumptions that a relatively simply phrased assignment is the work of an AI tool could negatively impact students and result in high rates of false positives. Loading student work into third-party systems may also violate our intellectual property policies.

I often tell my students not to be misled by the name 'artificial intelligence' - there is nothing artificial about it. AI is made by humans, intended to behave like humans, and, ultimately, to impact humans' lives and human society.

Fei-Fei Li

Strategies for Using Generative AI in Teaching

Opportunities

Generative AI technologies are being applied across many industries in creative ways. In teaching, these systems have shown promise in supporting:

  • Personalized learning by summarizing information, providing feedback, and recommending study materials based on individual needs.
  • Learning assistance by answering students' questions to help them understand difficult concepts.
  • Research assistance by suggesting relevant papers and articles that can assist with literature reviews and answering research questions.
  • Language learning by conversing with students in the language they are learning and providing feedback on grammar and vocabulary usage.
  • Writing quality by offering tips to improve sentence structure, grammar, and word choice, as well as feedback on plagiarism and how to prevent it.
  • Creative spark by quickly generating text or images that can launch students into an activity or assessment.

Cautions & Considerations

While it's important to engage students with technologies that are affecting their personal and professional lives, you need to be mindful of issues that might negatively impact student access, equity, and privacy. Taking time to critically analyze these elements will help you craft learning experiences and assignments that are truly supportive and transformational. As you consider experimenting with or applying generative AI in your teaching, remember:
  • Many of these tools come at a financial cost: There are significant financial barriers to accessing many generative AI systems, particularly as they are increasingly privatized and commodified. Most require a transaction fee or subscription, or operate on a "freemium" model, enabling limited free access to the tool while only allowing paid subscribers to access the full feature set. For example, ChatGPT was released to the public for free as a research preview, but on February 1, 2023, OpenAI launched ChatGPT Plus, a monthly subscription plan that promises users faster response times and priority access to new features and functionalities.
  • Emerging tools may not meet YU privacy and security standards: The servers that host generative AI models could be vulnerable to data breaches or hacking attacks, which could expose user data. These systems may also use third-party integrations or services, which could potentially expose user data to those third-party providers. If the tools have not been internally vetted by YU and TFS, we cannot be assured that they have appropriate privacy and security protocols in place.
  • Training data includes and amplifies bias: Most of the content on which many large language models, like ChatGPT, were trained are written in English and reflect Western norms and perspectives. As the use of generative AI increases, the output may reinforce and amplify the biases of dominant cultures.
  • Training data can be time-limited: Some AI systems, like the research preview of ChatGPT, have training cut-off dates. For example, ChatGPT was trained only on data produced before September 2021, so the tool does not have access to the most contemporary information.
  • Access to generative AI tools varies by country: For example, ChatGPT is currently banned or inaccessible in countries like Italy, China, Venezuela, and Iran. These restrictions may create access challenges for faculty and students.
  • Unclear copyright implications: Who owns works made by generative AI? This and many other questions related to copyright are currently being contested with no legal consensus in Canada. An article on the legal status of generative AI from the Canadian Bar Association provides a more in-depth analysis.

This interactive graphic describes additional harm considerations that are valuable to review as you plan for the inclusion of generative AI tools in teaching and learning.

Engage Students in Conversation

Generative AI technologies are increasingly common across many professions and fields of study. If students are to become critical consumers and ethical users of AI, they will benefit from discussing these technologies with their instructors and peers in a supportive environment. Eaton and Anselmo (2023) offer several suggestions for engaging students in developmental conversations about AI, including:

  • Asking about your students' current AI literacy. Consider questions like,
    • What do you know about artificial intelligence apps?
    • Have you used them before? Which ones and in what ways?
    • How can you ethically use AI apps to support your learning?
    • What are some limitations of these tools?
    • Do you know of any ways in which these tools are currently being applied in our discipline?
  • Establishing an AI-focused discussion thread on your Moodle site or a collaborative digital document where the class community can ask questions, share information, or post resources.
  • Proactively learning about how specific tools will collect and use your personal information and any inputted data. Like many other software/apps available on the web, access to AI-based tools requires users to agree with specific terms of service and privacy policies. Instructors and students should carefully read these documents and understand the risks associated with their use, prior to accepting the terms.

Set and Communicate Clear Expectations

Speak openly and frequently with your students about your expectations for the use of generative AI tools in your course. You may choose to co-create expectations for AI use with your students or they may already be established at the course- or program-level. In all cases, it is important to:

  • Be clear and consistent. Communicate and reinforce use expectations through multiple channels, including the course syllabus, Moodle site, assignment description, rubric, and/or verbally in-class.
  • Explain the rationale. Acceptable use of AI may vary across programs, courses, and/or types of assessments. Where possible, explain to students the rationale for including or excluding generative AI tools from your activities or assessments.

Develop Effective Prompts for AI Tools

If students will be exploring generative AI in your course(s), they should be aware that the quality of the outputs from these systems is directly related to the quality of the prompts users provide. Here are some recommendations for developing effective prompts:
  • Be specific: The more specific your question is, the more accurate the response will be. Avoid asking open-ended questions that can have multiple answers.
  • Use good grammar and punctuation: Many deep learning models perform best when prompts are composed of well-formed sentences.
  • Provide context: If your question is related to a specific topic or field, provide some context to help the technology understand the question better. If you would like content in a particular style, indicate that in your prompt. For example, you might ask ChatGPT to explain a concept in language appropriate for a 5th grade student in New Brunswick, Canada. Or, you might ask DALL·E 2 to generate drawings of the furniture layout for a 16 x 20 ft living room in a mid-century modern style.
  • Be concise: Keep your question short and to the point. Long and complex sentences can be difficult for these technologies to understand.
  • Avoid asking for or inputting personal information: It is important to be aware of how the information that you provide to these technologies will be stored, applied, and protected. Avoid inputting or requesting personal information about yourself or other individuals.
  • Be patient: Depending on the complexity of the prompt, it can take some time for these tools to generate a response. Also, you may experience outages or delays when using publicly available tools, including ChatGPT and DALL·E 2, due to high demand.
  • Be aware of the limitations: Generative AI systems have been trained on large, but not exhaustive, datasets. For example, some systems, like ChatGPT, may not have knowledge of certain information or events that have occurred after their knowledge cut-off date. They might also struggle with certain types of questions, like understanding sarcasm or idiomatic expressions, and may generate false, inaccurate, or derogatory information.

These recommendations have been adapted from "Digital Essentials" by the University of Queensland Library, shared under a Creative Commons Attribution-NonCommercial 4.0 International License.
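To make the recommendations above concrete, here is a minimal, hypothetical sketch in Python. The `build_prompt` helper is purely illustrative (it is not part of any AI tool or official API); it simply assembles a prompt from the elements recommended above — a specific task, explicit context, and a desired style — so you or your students can compare a vague prompt with a well-structured one before pasting it into a tool like ChatGPT.

```python
def build_prompt(task: str, context: str = "", style: str = "") -> str:
    """Assemble a specific, contextualized prompt from its parts.

    Illustrative helper only: it strings together the elements the
    recommendations above suggest (specific task, context, desired
    style) into a single concise prompt.
    """
    parts = [task.strip()]
    if context:
        parts.append(f"Context: {context.strip()}")
    if style:
        parts.append(f"Respond in this style: {style.strip()}")
    return " ".join(parts)

# A vague, open-ended prompt invites a generic answer...
vague = build_prompt("Tell me about evolution.")

# ...while a specific, contextualized, concise one narrows the response.
specific = build_prompt(
    "Explain how natural selection works, in no more than 150 words.",
    context="An introductory biology course for first-year university students.",
    style="Plain language appropriate for a 5th grade student.",
)
print(specific)
```

The same contrast applies to image generators: "draw a living room" is vague, while the DALL·E 2 example in the list above specifies dimensions, contents, and style.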

In-Class Activities

Generative AI tools can be used in ungraded learning activities. Bringing these technologies into lectures or online discussions can help students understand how and when to use AI effectively and ethically, and in ways that align with the norms and standards of your disciplinary context. Some learning activities to consider include:

  • using AI-generated text as the starting point for a class discussion on a particular topic. What does it get right? What is missing? How would it need to be revised to meet the scholarly standards of your field? Charlie Warren, associate professor at the University of Sydney, published an in-class activity plan for a recent lecture that illustrates engaging students in an exploration of generative AI outputs.
  • offering examples, or inviting students to find examples, of how AI is being applied in your field. What new opportunities does AI afford? What ethical questions and implications does its use raise?
  • having small teams of students experiment with using AI to create texts, images, or code, and then compare the results (what grade would they assign its response using a course rubric?) and/or the process (what prompts and tweaks were needed to generate and improve the content?)
  • creating practice questions and feedback. The instructor or students can create review questions to prepare for an upcoming test or exam using a tool like ChatGPT. For example, you might use a sample prompt like: Write three university-level multiple choice questions that target key student misconceptions around how natural selection works. Include feedback for students. Review the questions with your class and check the accuracy of the responses.

Incorporating AI in Graded Assessments

As outlined under Institutional Expectations above, acceptable use of AI in summative (graded) assessments is determined by the academic program, through its curriculum committee, and may vary across courses and types of assessments. Communicate any requirements or restrictions clearly in assignment and test descriptions, review the privacy implications of any required tools with your students, and offer an alternative to students who do not wish to create an account with a generative AI service. When assessments involve industry or community partners, professional portfolios, or competition entries, also consider industry-standard expectations around AI use and any specific directives from partner organizations.

Cite Your AI Use

Instructors should model good citation practices when using AI-generated content in their teaching and should encourage learners to do the same. While citation guidelines are evolving, here are some sample APA guidelines for citing text and images sourced from artificial intelligence that may serve as a helpful starting point.

Monitor and Act on Concerns

Monitor student work for significant changes in the style or quality of their submissions, or for content and references that, while plausible, are factually incorrect or do not exist. If you have concerns regarding student work and believe that an academic offence may have been committed, follow the procedure outlined in section 5.6.3 of the Academic Calendar.

Strategies for Limiting the Influence of Generative AI on Assessments

There are situations where independent skill building is essential and it will be important to limit the influence of generative AI on student work. There are approaches that can be applied when designing and deploying assessments to minimize student reliance on these tools. 

Enhance the authenticity of the assessment

Students could be asked to discuss their own individual experiences as they relate to course topics, or to provide responses to real or fictional case studies. The more personally and contextually specific the task, the less relevant a generic response from a tool like ChatGPT will be.

Increase the currency of the assessment

Some generative AI tools have training cut-off dates, meaning that they have limited access to more contemporary information. Choosing or inviting students to analyze or apply very recent articles, examples, or cases in their assessments may limit the influence and applicability of generative AI tools.

Focus on reason, decision-making, and process

Emphasize the process over the final product, adding elements such as proposals, drafts, annotation, or feedback into your assignments. Add in a reflective component to an assignment. Ask students to write or record annotations or a holistic self-assessment about their process—what steps did they take and why? Why did they choose a certain answer? What other options did they consider?

Reference specific class discussions or content

Are there acronyms, examples, or ideas that are unique to your course? If yes, build assignments or tests that invite students to apply, reference and/or connect these materials specifically in their responses.

Chunk the assessment

Consider breaking assignments, especially significant ones, into chunks or stages. Students can submit a portion of the assignment, receive feedback from their instructor, their peers, or both, and then demonstrate the integration of that feedback into future components or iterations of the assignment. They may also be invited to reflect on why and how they integrated feedback elements into their developing work.

Resources and Links

Generative AI and Teaching

AI in Higher Education Resource Hub 
Produced and maintained by Contact North, this resource hub organizes information according to the following categories: Latest Developments; Background on AI; Learning Experiences, Course Creation & Learner Support; Assessment, Grading, Examinations, and Academic Policy & Concerns. 

How AI can be used meaningfully by teachers and students in 2023 
Some ideas from staff at the University of Sydney about how educators and students might productively engage with generative AI to enhance higher education learning, teaching, and assessment. 

Teaching and Learning with Artificial Intelligence Apps 
Dr. Sarah Eaton and Lorelei Anselmo from the University of Calgary review strategies for engaging students in conversation about generative AI and its applications in learning, and revising assessments to focus on creative analysis and personal application.

Sentient Syllabus 
An initiative by academics for academics to navigate the uncharted waters of the era of Artificial Intelligence. This collective aims to draft and improve resources and publicly share ideas around the application and impact of AI in higher education. All materials produced for the Sentient Syllabus project are made available under a Creative Commons license.

ChatGPT consistently fails (most parts of) the assessment tasks I assign my students. Here's why. 
Jason M. Lodge, Deputy Associate Dean (Academic) and Associate Professor of Educational Psychology, at the University of Queensland reflects on the design of authentic assessments for his education students and the limitations of large language models like ChatGPT.

AI in Education: Guidance for Policy Makers
This big picture view of the use of AI in Education from UNESCO published in 2021 offers some interesting insights into the use of AI in teaching and learning as well as AI’s use in educational administration. It also considers if and how AI can support UN Sustainable Development Goal 4: Ensure inclusive and equitable quality education and promote lifelong learning opportunities for all. 

ChatGPT and Artificial Intelligence in Higher Education (UNESCO)

This Quick Start Guide is a short, jargon-free downloadable guide that provides an overview of how ChatGPT works and explains how it can be used in higher education. The Quick Start Guide raises some of the main challenges and ethical implications of AI in higher education and offers practical steps that higher education institutions can take.

The ChatGPT Storm and What Faculty Can Do

This article, from the Nurse Educator journal, provides a simple and thorough list of the capabilities of ChatGPT in an educational setting, as well as its limitations and ethical challenges. The authors offer a series of recommendations to support faculty in building familiarity with ChatGPT and creating assignments that leverage its strength and develop AI literacy among students.

Assessment redesign for generative AI: A taxonomy of options and their viability

The authors offer a taxonomy of assessment redesign options to address the use of AI with the aim of laying the groundwork for further discussion on the potential and feasibility of these methods. In doing so, they hope to shift the conversation beyond simply banning or policing new technology (i.e. focussing only on the means), and towards more constructive and innovative solutions.

Generative AI and Academic Integrity

Supporting Academic Integrity: Ethical Uses of Artificial Intelligence in Higher Education Information Sheet  
The Academic Integrity Council of Ontario recently released a guide with recommendations regarding the use of artificial intelligence in an academic setting. The document aims to provide all higher education stakeholders with general information about artificial intelligence as well as outline some considerations for its ethical use in higher education.  

Digital Essentials - Artificial Intelligence 
This student-facing resource from the University of Queensland reviews benefits and risks of AI use in academic settings, including connections to academic integrity. The site provides excellent, student-friendly examples and is made available under a Creative Commons license. 

Academic Integrity in Canada: An enduring and essential challenge 
Edited by Yorkville University President, Julia Christensen Hughes, and academic integrity scholar, Sarah Elaine Eaton, this open access text reviews emerging ideas and perspectives on contemporary academic integrity. 

AskYU