Last updated on: April 13, 2023
AI technology is constantly evolving. The content on this page will be monitored and updated as new information becomes available.
Generative artificial intelligence (AI) is a category of technology that can produce new content, including audio, code, images, text, simulations, and videos. Recent breakthroughs in the field, including the public release of ChatGPT in November 2022, are already radically changing the way individuals and organizations approach content creation. Awesome Generative AI provides an extensive list of generative AI technologies, categorized by type, including text, code, image, video, and audio generators. You are encouraged to check the privacy policies and terms of use for any generative AI tools before applying them.
There are several AI technologies currently garnering significant attention in higher education, such as OpenAI’s ChatGPT, DALL·E 2, and Whisper.
Currently, several generative AI technologies, including ChatGPT, Whisper, and DALL·E 2, are available to access and trial at no cost with limited functionality. You may wish to explore how the tools respond to course assignments or prompts or brainstorm opportunities to use the tools to support class activities or inspire discussions. Because of the overwhelming interest in these systems, users may experience outages or delays when accessing them.
There are many creative opportunities for experimenting with and applying generative AI technology to support teaching and learning. You can introduce and invite students to test AI through ungraded in-class and online activities. Acceptable use of AI in summative (graded) assessments should be determined by the academic program, through its curriculum committee, and may vary across courses and types of assessments. Any explicit requirements for the use of AI-generated material, along with relevant guidelines or use restrictions, should be clearly communicated in the assignment or test descriptions.
Requiring students to apply generative AI tools in summative assessments is complicated by the fact that many have not been vetted by YU for privacy or security and that students may need to disclose personal information (e.g., phone numbers) to gain access to generative AI services. If tools like ChatGPT are integrated into course activities or assignments, you and your students should review the tool’s privacy policy and consider its privacy implications. Students should also be offered an alternative option if they do not wish to create an account that would allow them to engage directly with the generative AI tool.
When considering the integration of AI into assessments, especially ones that may involve industry or community partners, or ones that may form part of a student’s professional portfolio or may be submitted as part of a competition or creative entry, it will be especially important to consider any industry-standard expectations and guidelines around AI use or any specific directives from our partner organizations.
Assessments help students consolidate their understanding and demonstrate the skills needed to perform in professional roles after graduation. To successfully complete a program, students must independently demonstrate achievement of course and program learning outcomes.
While AI tools may be used to explore disciplines, spark ideas, and support studying, students should openly acknowledge and document their use of generative AI in any coursework (see the Student page for examples of appropriate and inappropriate AI use in educational settings). Assignments and tests may have explicit requirements or restrictions on the use of AI-generated material. By submitting coursework, students assert that they have respected these requirements. Where students incorporate AI-generated material into coursework, they assert that it accurately reflects the facts. They also take full responsibility for AI-generated materials as if they had produced them independently: ideas must be accurately attributed, and facts must be carefully and accurately reported.
A student may not represent AI-generated work as their own. A student who uses generative AI to complete any assessment, in whole or in part, without appropriately acknowledging and citing the AI-generated content violates our Academic Integrity policy. Such unauthorized AI use may be considered cheating, as described under the Academic Offences section of the Academic Calendar (“Employing any unauthorized academic assistance in completing assignments or examinations”), or plagiarism (“the act of representing someone else’s work as one’s own”).
While citation guidelines are evolving, here are some sample APA guidelines for citing text and images sourced from artificial intelligence that may serve as a helpful starting point.
YU is committed to providing students with high-quality, authentic, and actionable feedback informed by the academic and professional experiences of our faculty. You should not submit student work to any third-party software system for grading, or any other purpose, unless the software is approved by YU. A completed assignment is the student’s intellectual property (IP) and should be treated with care.
You should regularly monitor student work for significant changes in style or quality, and for content or references that, while plausible, are factually incorrect or do not exist. In addition to your own efforts, an AI Indicator (beta version) is integrated within Turnitin, our approved, enterprise-level plagiarism detection tool. For more information on how to use the AI Indicator, including its strengths and limitations, review our Turnitin AI Indicator tip sheet. The Turnitin AI Indicator is only one limited form of evidence of possible academic misconduct and should be applied with caution, and only if other evidence of integrity issues exists (e.g., a marked change in writing style or level, incorrect or falsified references, or falsified information or data).
YU discourages the use of other third-party AI detectors, like GPTZero, on student work. The efficacy of these detectors is limited, and some use the level of writing sophistication to determine whether a piece was generated by AI. Assuming that a simply worded assignment is the work of an AI tool could negatively impact students and result in high rates of false positives. Loading student work into third-party systems may also violate our intellectual property policies.
Generative AI technologies are being applied across many industries in creative ways. In teaching, these systems have shown promise in supporting:
Generative AI technologies are increasingly common across many professions and fields of study. If students are to become critical consumers and ethical users of AI, they will benefit from discussing these technologies with their instructors and peers in a supportive environment. Eaton and Anselmo (2023) offer several suggestions for engaging students in developmental conversations about AI, including:
Speak openly and frequently with your students about your expectations for the use of generative AI tools in your course. You may choose to co-create expectations for AI use with your students or they may already be established at the course- or program-level. In all cases, it is important to:
Generative AI tools can be used in ungraded learning activities. Bringing this tech into lectures or online discussions can help students understand how and when to use AI technology effectively and ethically, and in ways that align with the norms and standards of your disciplinary context. Some learning activities to consider include:
Instructors should model good citation practices when using AI-generated content in their teaching and should encourage learners to do the same.
Monitor student work for significant changes in style or quality, and for content or references that, while plausible, are factually incorrect or do not exist. If you have concerns about student work and believe that an academic offence may have been committed, follow the procedure outlined in section 5.6.3 of the Academic Calendar.
There are situations where independent skill building is essential, and it will be important to limit the influence of generative AI on student work. Several approaches can be applied when designing and deploying assessments to minimize student reliance on these tools.
Students could be asked to discuss their own individual experiences as they relate to course topics, or to provide responses to real or fictional case studies. The more personally and contextually specific the task, the less relevant a generic response from a tool like ChatGPT will be.
Some generative AI tools have training cutoff dates, meaning that they have limited access to more recent information. Choosing, or inviting students to choose, very recent articles, examples, or cases to analyze or apply in their assessments may limit the influence and applicability of generative AI tools.
Emphasize the process over the final product by adding elements such as proposals, drafts, annotations, or feedback to your assignments. Add a reflective component: ask students to write or record annotations or a holistic self-assessment about their process. What steps did they take, and why? Why did they choose a certain answer? What other options did they consider?
Are there acronyms, examples, or ideas unique to your course? If so, build assignments or tests that invite students to apply, reference, and/or connect these materials specifically in their responses.
Consider breaking assignments, especially significant ones, into chunks or stages. Students can submit a portion of the assignment, receive feedback from their instructor, their peers, or both, and then demonstrate the integration of that feedback into future components or iterations of the assignment. They may also be invited to reflect on why and how they integrated feedback elements into their developing work.
AI in Higher Education Resource Hub
Produced and maintained by Contact North, this resource hub organizes information according to the following categories: Latest Developments; Background on AI; Learning Experiences, Course Creation & Learner Support; Assessment, Grading, Examinations, and Academic Policy & Concerns.
How AI can be used meaningfully by teachers and students in 2023
Some ideas from staff at the University of Sydney about how educators and students might productively engage with generative AI to enhance higher education learning, teaching, and assessment.
Teaching and Learning with Artificial Intelligence Apps
Dr. Sarah Eaton and Lorelei Anselmo from the University of Calgary review strategies for engaging students in conversations about generative AI and its applications in learning, and for revising assessments to focus on creative analysis and personal application.
Sentient Syllabus
An initiative by academics for academics to navigate the uncharted waters of the era of Artificial Intelligence. This collective aims to draft and improve resources and publicly share ideas around the application and impact of AI in higher education. All materials produced for the Sentient Syllabus project are made available under a Creative Commons license.
ChatGPT consistently fails (most parts of) the assessment tasks I assign my students. Here's why.
Jason M. Lodge, Deputy Associate Dean (Academic) and Associate Professor of Educational Psychology, at the University of Queensland reflects on the design of authentic assessments for his education students and the limitations of large language models like ChatGPT.
AI in Education: Guidance for Policy Makers
This big-picture view of AI in education, published by UNESCO in 2021, offers some interesting insights into the use of AI in teaching and learning as well as in educational administration. It also considers if and how AI can support UN Sustainable Development Goal 4: Ensure inclusive and equitable quality education and promote lifelong learning opportunities for all.
ChatGPT and Artificial Intelligence in Higher Education (UNESCO)
This Quick Start Guide is a short, jargon-free downloadable guide that provides an overview of how ChatGPT works and explains how it can be used in higher education. The Quick Start Guide raises some of the main challenges and ethical implications of AI in higher education and offers practical steps that higher education institutions can take.
The ChatGPT Storm and What Faculty Can Do
This article, from the Nurse Educator journal, provides a simple and thorough list of the capabilities of ChatGPT in an educational setting, as well as its limitations and ethical challenges. The authors offer a series of recommendations to support faculty in building familiarity with ChatGPT and in creating assignments that leverage its strengths and develop AI literacy among students.
Assessment redesign for generative AI: A taxonomy of options and their viability
The authors offer a taxonomy of assessment redesign options to address the use of AI, with the aim of laying the groundwork for further discussion on the potential and feasibility of these methods. In doing so, they hope to shift the conversation beyond simply banning or policing new technology (i.e., focusing only on the means), and towards more constructive and innovative solutions.
Supporting Academic Integrity: Ethical Uses of Artificial Intelligence in Higher Education Information Sheet
The Academic Integrity Council of Ontario recently released a guide with recommendations regarding the use of artificial intelligence in an academic setting. The document aims to provide all higher education stakeholders with general information about artificial intelligence as well as outline some considerations for its ethical use in higher education.
Digital Essentials - Artificial Intelligence
This student-facing resource from the University of Queensland reviews benefits and risks of AI use in academic settings, including connections to academic integrity. The site provides excellent, student-friendly examples and is made available under a Creative Commons license.
Academic Integrity in Canada: An enduring and essential challenge
Edited by Yorkville University President Julia Christensen Hughes and academic integrity scholar Sarah Elaine Eaton, this open access text reviews emerging ideas and perspectives on contemporary academic integrity.