Introduction

Faculty and students at Lehigh are focused on artificial intelligence (AI) and its place in university classrooms. AI as an aid to productivity. AI as a coding assistant and teaching consultant. AI for improvisation and shared humor. Also: is students’ use of AI at odds with academic integrity?

 This web page on AI and approaches to teaching and learning will:

  1. help faculty talk with students and set policies regarding the use of AI; and

  2. aid faculty in the design of assignments and assessments related to AI.


What is Generative AI?

In a few words, “AI can be defined as ‘automation based on associations.’”1 In longer form: “AI is a branch of computer science. AI systems use hardware, algorithms, and data to create ‘intelligence’ to do things like make decisions, discover patterns, and perform some sort of action.”2 Generative AI refers to AI systems that produce text and images, among other possibilities.

What is ChatGPT?

ChatGPT is an AI-powered text generator. It is also a chatbot and a web application. ChatGPT is powered by a large language model, or LLM. At their inception, LLMs depend on text mining and/or web scraping to build a corpus of text for the model to learn from. An LLM then learns from that corpus according to rules, or algorithms, set by developers, as well as training by humans.

LLMs “learn a probability distribution of the next word/pixel/value in a sequence.”3 This means that they do not output whole sentences from a text-mined foundation based on a user's prompt. Instead, LLMs and their chatbot intermediaries build outputs word by word, based on machine learning. This is one reason why proper citation becomes an issue for people who rely on text generators. Outputs from ChatGPT are probable and plausible, but they do not draw from specific, reliable sources. This is also why text generators make up references, or hallucinate, even when a user asks for citations within a prompt.4
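To make that word-by-word process concrete, below is a minimal, illustrative sketch in Python. It is not how ChatGPT is implemented; it simply counts which words follow which in a tiny made-up corpus and then samples each next word from that probability distribution, the same generate-one-word-at-a-time idea scaled down drastically.

import random
from collections import defaultdict, Counter

# A tiny made-up "corpus" standing in for the web-scale text an LLM learns from.
corpus = (
    "students ask questions about ai . "
    "faculty ask questions about teaching . "
    "students write essays about teaching ."
).split()

# Count which words follow each word: a crude stand-in for the learned
# probability distribution of the next word in a sequence.
next_word_counts = defaultdict(Counter)
for current, following in zip(corpus, corpus[1:]):
    next_word_counts[current][following] += 1

def generate(start, length=8):
    """Build an output word by word, sampling each next word in proportion
    to how often it followed the previous word in the corpus."""
    words = [start]
    for _ in range(length):
        counts = next_word_counts[words[-1]]
        if not counts:
            break
        choices, weights = zip(*counts.items())
        words.append(random.choices(choices, weights=weights)[0])
    return " ".join(words)

print(generate("students"))
# One possible output: "students ask questions about teaching . students write essays"
# The result reads as plausible, but it is assembled from probabilities rather than
# retrieved from any single source, which is why citation and hallucination become issues.

Even at this toy scale, the output can read as fluent without corresponding to any one source document; real LLMs do the same kind of thing with neural networks trained on vastly larger corpora.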

The impact of AI

In a recent critical literature review, researchers identified a discourse of imperative change surrounding AI in higher education.5 But that imperative should be critiqued and analyzed in specific scenarios of teaching and learning. Faculty attention is stretched in many directions, and bandwidth is limited. Take time to consider the impact of AI on your course and field of study. Know that there will be many opportunities for change in semesters and years to come.

Speaking to students about AI

Break the ice

Use a quote, image, multimedia, the news, or movie reference to break the ice of a difficult (and technical!) issue such as the appropriate use of AI-powered tools in teaching and learning.6 Ask questions to open up a conversation about students’ use of AI. Are they using AI-powered tools? If not, why?7 If so, how?8 Aim to create a community of inquiry in which you learn from students and students learn from you.

Trust your students

One reason to introduce AI through a movie reference or news coverage is to pique students’ interest while addressing a pressing issue. Thinking about movies helps us identify anxieties related to AI and realize that a productive pedagogical starting point is to trust your students.

Summarize your policies

Set aside time early in a course to clarify your policies related to AI. At a time of technological change, such policies will vary from course to course, and students will experience them unevenly across their course load. Some faculty will prohibit the use of AI in their classrooms. Others will make AI a constitutive part of the learning environment they create. Consider adding a policy on AI to your course syllabus. An LU Syllabus Template developed by Greg Reihman provides a few examples. Feel free to select one, or modify the language as you see fit.9

AI detectors

During the 2023 spring and fall semesters, instructors had access to the Turnitin AI detection tool as part of the university's ongoing license. For spring 2024, Turnitin requires a separate license to access this AI detection functionality. LTS is currently evaluating this tool and will report out to faculty on whether we deploy this or another tool.10

In all cases, be sure to have your students review Undergraduate and Graduate Student Senate Statements on Academic Integrity and Article III of Lehigh's Code of Conduct.

Foreground equity

Foreground equity and algorithmic bias in your first discussions with students about AI. One example of algorithmic bias within AI systems involves police departments that use facial recognition programs to compare surveillance footage to a database of potential suspects. Algorithmic decision-making is also easily polluted by the racial and gender biases embedded in our society, which are reflected in large training sets and the actions of developers.11 Gender biases (Who wasn't working hard enough? Who was late?) are also prominent in ChatGPT's responses to certain prompts.12

Explore AI tools together

You will be better prepared to lead students through a critical use of ChatGPT and other AI tools with a general understanding of how text generators are built and operate. You can also practice prompting together. What is ChatGPT good at? Where does ChatGPT struggle? Have students fact-check and refine outputs, identify biases, and compete with ChatGPT on responses to course-related queries.

Consider students' professional development

The current conversation around AI is market-driven. According to a recent report from Stanford University, “The demand for AI-related professional skills is increasing across virtually every American industrial sector.”13 A contribution of a Lehigh education, however, is to consider the ethical use of AI alongside its possible adoption in specific disciplines, to track misuse, and to imagine a better future shaped by AI.

Follow the news

One way to stay up-to-date with the ever-changing landscape of AI is with a newsreel created by Future Tools. The AIAAIC also keeps a list of “AI, algorithmic, and automation incidents and controversies.”

The Next Generation

AI-powered tools generate prose, poems, music lyrics, and code. AI for image and video generation is already here. So what's next? Chart a path for the future of AI alongside developers and policymakers, one that considers intellectual property and copyright, ethical labor practices, public safety without over-policing, and a cleaner planet. In the words of Jean-Luc Picard: Engage!

Footnotes

1. Miguel Cardona, et al., “Artificial Intelligence and the Future of Teaching and Learning,” U.S. Department of Education, 2023. Source.

2. Pati Ruiz and Judi Fusco, “Glossary of Artificial Intelligence Terms for Educators,” The Center for Integrative Research in Computing and Learning Sciences, 2023. Source.

3. Will Thompson, “What We Know About LLMs (Primer),” July 23, 2023. Source.

4. Benj Edwards, “Why ChatGPT and Bing Chat are so good at making things up,” Ars Technica, April 6, 2023. Source.

5. Margaret Bearman, Juliana Ryan, and Rola Ajjawi, “Discourses of artificial intelligence in higher education: a critical literature review,” Higher Education 86 (2023), 369-85. Source.

6. You can find a list of movies and TV shows with AI-related themes in this Lehigh guide to “Generative Artificial Intelligence.”

7. According to a recent study, “Only 35% of sampled Americans (among the lowest of surveyed countries) agreed that products and services using AI had more benefits than drawbacks.” Nestor Maslej, et al., “The AI Index 2023 Annual Report,” AI Index Steering Committee, Institute for Human-Centered AI, Stanford University, Stanford, CA, April 2023. Source.

8. A survey of 399 undergraduate and postgraduate students from various disciplines in Hong Kong revealed a generally positive attitude towards GenAI in teaching and learning. Cecilia Ka Yuk Chan and Wenjie Hu, “Students' voices on generative AI: perceptions, benefits, and challenges in higher education,” International Journal of Educational Technology in Higher Education 20, no. 43 (2023): 1-18. Source.

9. For more examples of language you might borrow for your syllabus, see: “Classroom Policies for AI Generative Tools,” a crowdsourced Google Doc organized by Lance Eaton, the Director of Digital Pedagogy at College Unbound in Providence, RI; and Boris Steipe et al., The Sentient Syllabus Project: Charting a Course for the Academy in an Era of Synthesized Thought, founded December 2022. Source. The project's website, print materials, and Substack include guides for understanding AI issues, sample text for a syllabus, and course activities involving AI.

10. Also see: Rhiannon Williams, “AI-text detection tools are really easy to fool,” MIT Technology Review, published July 7, 2023, source; and Andrew Myers, “AI-Detectors Biased Against Non-Native English Writers,” Stanford University Human-Centered Artificial Intelligence blog, published May 15, 2023, source.

11. See the case of Robert Williams and the Detroit Police Department from January 2020. Kashmir Hill, “Wrongfully Accused by an Algorithm,” The New York Times, June 24, 2020. Source. A National Institute of Standards and Technology Interagency or Internal Report from December 2019 found that "With domestic law enforcement images, the highest false positives are in American Indians, with elevated rates in African American and Asian populations; the relative ordering depends on sex and varies with algorithm. We found false positives to be higher in women than men, and this is consistent across algorithms and datasets. This effect is smaller than that due to race." For details of the corpus of images used by the NIST in their study, see: Patrick Grother, Mei Ngan, and Kayee Hanaoka, “Face Recognition Vendor Test (FRVT) Part 3: Demographic Effects,” National Institute of Standards and Technology Interagency or Internal Report, published December 2019, 2. Source.

12. Sayash Kapoor and Arvind Narayanan, “Quantifying ChatGPT’s gender bias,” AI Snake Oil, April 26, 2023. Source.

13. Nestor Maslej, et al., “The AI Index 2023 Annual Report,” AI Index Steering Committee, Institute for Human-Centered AI, Stanford University, Stanford, CA, April 2023. Source. According to the same report, “Global AI private investment was $91.9 billion in 2022, which represented a 26.7% decrease since 2021.”