At this year's faculty retreat, the keynote address was given by Dr. Rafe Steinhauer, an Instructional Assistant Professor at the Thayer School of Engineering at Dartmouth College. In the breakout session that followed, I explicitly put into words my current core pedagogical values: to promote the synthesis of information and divergent thinking while encouraging active learners to find and express their voices.
Additionally, I attended two very interesting session talks in the category of "Creative Pedagogy and Assessment." Both focused on the technological enhancement of learning in the classroom.
The first session talk was given by Jessica Burke, the Assistive Technology Coordinator for PAL at Curry College, and was titled "Assistive Technology for the Classroom: Supports to Enhance Student Learning and Engagement." She showed us many useful multimodal teaching tools, including real-time subtitles in PowerPoint captured via a microphone. She also presented the benefits of the lesser-used Microsoft Edge browser: reading websites and content aloud, its Immersive Reader, and split-screen viewing. Finally, she demonstrated various dictation programs that enable those with writer's block to speak out their stream of consciousness and those with motor coordination difficulties to have text typed for them easily; of course, the dictated text would require editing afterward. Various programs for reading textbooks aloud were also presented. Many of these tools can be useful for faculty themselves and are worth suggesting to all students, in addition to supporting those with learning differences. Aside from the content, Jessica was gregarious, cheerful, and helpful, which lit up the room and put everyone at ease to ask questions freely.
The second session talk was given by Brooks Winchell, the Executive Director of Online Learning at Quinsigamond Community College, and was titled "AI Powered Student Success: Custom AI Tools and Educational Applications." Before the session, I spoke with him about what we faculty have been experiencing as a major problem with AI: many students have their papers written for them by chatbots and OpenAI's LLMs (large language models). We discussed how this defeats the purpose of students learning the material, synthesizing the information, and writing a paper themselves to demonstrate their knowledge. Brooks agreed this was definitely a problem for assessing a student's learning.
During this session, Brooks presented various AI tools for creating animated videos, music, and even programming code. Most interesting was the Poe chatbot, which integrates multiple AI chatbots into one simple framework. There was some pushback from the faculty, especially the computer science professors, who argued that having AI write code for you was not doing your own work and that it often led to sloppy, circuitous code. The counterargument was that typical coding is built upon other people's work, and that reusing code and adapting it to one's own purpose is expected; in essence, programming rests on a foundation where one is not expected to reinvent the wheel. Mathematicians in the room were concerned about AI calculating problem sets for their students. Brooks explained that this was not readily possible, but he then mentioned that he had run a data set through AI, which provided an output. When I asked whether that output gave him statistical results, he confirmed it did. I then questioned how this was possible if AI was not doing calculations, since performing a statistical analysis requires them. He gave a vague answer about comparing the data to online databases, yet such a comparison would itself rest on mathematical calculations, even if it were not a statistical analysis of the data itself. My colleague Dr. Weiser mentioned that he had put data into ChatGPT and asked it to run a regression analysis, which it had done successfully. Still, he warned that the results needed to be checked against external statistical analysis programs to verify them. In that case, what would be the point, since you would need to do the analysis yourself anyway? It would simply be a test of AI, which may not always be accurate.
Brooks' tools and suggested uses appeared practical, but he noted that the ethics of using them in specific circumstances were outside his expertise and not something he could address in detail. The bottom line was that Brooks agreed that students using AI to do all of their work defeats the purpose of education, which mirrored the views of all the faculty present. However, the proper and ethical use of AI could benefit faculty and students in various ways: creating outlines and syllabi, brainstorming, and even conducting mock job interviews in which a digital AI voice interacts with us and provides feedback. Some of these uses, however, are against our college's current explicit policy.
At the end of the second session, I decided to use AI against AI misuse. I asked ChatGPT, "What can a professor use as an assignment for students in an Introduction to Psychology course at the college level in lieu of a term paper, specifically to prevent students from using AI to write their papers for them?" Here is the output from ChatGPT on 5/22/2024:
To design assignments for an Introduction to Psychology course that are less susceptible to AI misuse, consider the following options:
1. Oral Presentations:
Students can prepare and deliver a presentation on a specific psychological concept or study.
They can be assessed on both the content and their delivery, including the ability to answer questions on the spot.
2. Class Debates:
Organize debates on controversial psychological topics.
Students must research their positions and present arguments and counterarguments.
3. Reflective Journals:
Assign weekly reflective journals where students relate psychological theories to their personal experiences or current events.
This ongoing assignment is harder to outsource and can reveal individual thought processes.
4. Case Studies:
Students can analyze a specific psychological case study, applying theoretical concepts to real-life scenarios.
Include an in-class component where students discuss their findings.
5. Group Projects:
Form small groups to work on a comprehensive project that includes research, a presentation, and a written report.
The collaborative nature and multiple deliverables make it difficult for AI to complete the work.
6. In-class Writing Assignments:
Conduct short writing assignments during class time on topics covered in the lectures.
These can be impromptu and timed to ensure authenticity.
7. Creative Projects:
Encourage students to create something artistic (like a short film, podcast, or artwork) that illustrates a psychological principle.
Accompany the creative work with a brief explanation of the concepts depicted.
8. Peer Teaching:
Have students take turns teaching a section of the course material to the class.
This method helps ensure they understand the material deeply enough to explain it to others.
9. Role-playing Exercises:
Design scenarios where students must role-play different psychological theories or disorders.
Assess their understanding and application of concepts through their interactions and discussions.
10. Annotated Bibliographies:
Require students to compile and annotate a bibliography of relevant psychological literature on a specific topic.
They should explain why each source is significant and how it contributes to the understanding of the topic.
By incorporating these methods, professors can create a dynamic and interactive learning environment that minimizes the likelihood of AI being used to complete assignments, while also fostering deeper understanding and engagement with the course material.
Note: Annotated bibliographies (item 10) can themselves be generated using AI; Brooks showed us in session how this can be done.