Most of the final exam in Andy Olstad’s quantitative foundations class at the Oregon State University College of Business is focused on cut-and-dried statistics. But every term, Olstad gives students a chance to describe in an open-ended essay their experiences in his rather challenging class, hoping they will pinpoint the virtues that have helped them persevere.

Now, a computer can answer that question.

Earlier this year, Olstad fed his prompt into ChatGPT, an artificial intelligence chatbot from OpenAI that burst into public view last November and continues to grow more advanced. ChatGPT can write like a human. And when it answered Olstad’s prompt, he couldn’t distinguish the computer’s answer from a student’s.

“That was really shocking,” Olstad said. “I figured, ‘Oh, well, I’ll be able to tell the difference.’ I really couldn’t and that was an eye-opener.”

As a new school year gets underway, professors at Oregon’s colleges and universities are racing to adapt their teaching to publicly available artificial intelligence, a technology with the potential to be as disruptive as calculators or the internet. Advances in artificial intelligence open new avenues for students to skip steps or cheat on assignments, avenues professors have already seen put to use. At the same time, colleges and universities are encouraging faculty to get familiar with the technology so they can incorporate it into their teaching.

That was Olstad’s approach last year. On his spring final, he told students they could use ChatGPT for the essay, or they could write it themselves. If they used artificial intelligence, they had to include a list of the prompts they’d asked the bot.

Only a quarter of Olstad’s students used the bot, he said, and the ChatGPT essays were consistently mediocre, with a few exceptions. One of his highest-scoring students used ChatGPT for help. She turned in nearly 40 pages of back-and-forth with the AI, showing Olstad how she engaged with the prompt and refined the bot’s suggestions to make the essay her own.

It was reassuring, Olstad said, to see students think deeply about the question, even if they were answering it with the technology. On the other hand, Olstad said he may need to reconsider how to work with students who are less excited. The old idea that students had to engage with an essay to get it done just doesn’t hold anymore, he said.

“I think we’re in for huge changes,” said Olstad, who is also the director of teaching and learning excellence in the business school. “It’s not going to be possible to ignore AI and keep teaching well.”

AI-generated essay example

Reporter Sami Edge asked ChatGPT to answer a prompt similar to Olstad’s in 300 words. The bot generated its answer in seconds. To read the full answer, visit: https://bit.ly/3t2dZg2

As the 2023-24 school year kicks off at most of Oregon’s public colleges and universities next week, it will largely be up to individual professors to decide how they will tackle artificial intelligence in the classroom.

In surveys of North American professors this year, academic publishing company Wiley found that most believe their students are already using AI in the classroom. Only 31% of the professors said they felt positive about the technology.

Some of Oregon’s largest institutions, including the University of Oregon, Oregon State and Portland Community College, have created faculty cohorts to help educators learn about artificial intelligence, pinpoint their concerns and guide them to think about whether and how the technology might be useful to their lessons or students. At an AI symposium next week, Portland Community College will teach faculty how to use ChatGPT.

“It is a tool we will need to figure out how to embrace where appropriate,” Jen Piper, a dean in the community college’s business school, said in an email. “Students will need to learn how to be good users of AI tools as it will impact their ability to be successful in the workplace.”

During a back-to-school webinar in mid-September, members of Oregon State’s Higher Education AI Task Force urged faculty to play with AI tools if they hadn’t already, communicate clearly with their students about whether they were allowed to use AI in the course and feed their assignments through tools like ChatGPT to see what the bots can do.

At some institutions, professors are reporting more AI-related academic misconduct. Since February, AI misuse in coursework has made up 50% of all academic misconduct cases at the University of Oregon, Dean of Students Marcus Langford said. UO has also amended its policies on plagiarism to include the use of AI, spokesperson Angela Seydel said. Portland Community College saw a 10% increase in misconduct reports related to AI in the spring, student conduct manager Charisse Loughery said, but homework sharing websites like Course Hero and Chegg were still a larger problem. Professors have expressed concerns about the impact the technology will have on critical thinking and writing, which are often intertwined.

Jaden O’Hara, a sophomore at the University of Portland, said he first learned about ChatGPT last year from his cousins. At first he didn’t believe there was an AI that could write an entire essay for him – until he started to play around. As an engineering student, O’Hara has few classes where he’s required to write essays and says he hasn’t used ChatGPT for the ones he does have. But he did use the bot for help with Calculus 2, asking it to explain concepts he didn’t understand and give him practice problems.

“It’s a pretty useful tool,” he said.

Jacqueline Van Hoomissen, associate dean in the University of Portland College of Arts and Sciences, talks with students during a class on “exploring and using generative AI tools.” Van Hoomissen co-teaches the artificial intelligence course with faculty from math and other disciplines who pulled together the one-credit seminar over the summer in response to rapid developments in AI.

Inara Scott, a senior associate dean in Oregon State’s business school, wants faculty to see the promise of AI, not just its perils. Several professors in her school are using it in the classroom, and the business school has created a set of icons to show students the extent to which AI is allowed or expected in their classes. In a data analytics course, for example, a professor is allowing students to use ChatGPT to help them write code. Coding has been a hurdle for some students, Scott said, and the technology helps move them more quickly to the actual goal of the class, which is to analyze the data.

Scott thinks the new technology demands that faculty think critically about what they’re teaching, how they’re teaching it, and why.

“Students hold an enormous amount of power in this new world. They can do our assignments, most of the time, with the AI,” she said. “So I have to be really good at designing creative, engaging, authentic assignments that students want to do and see the value of.”

The University of Portland is hoping to position its students to take on leadership roles in the AI field, said Valerie Banschbach, dean of the university’s College of Arts & Sciences. She thinks the continued advancements in AI will underscore the value of a liberal arts education that has taught students to think critically about the ethical and social implications of the technology.

“AI parallels things like climate change. If you’re viewing it as a threat and a problem, it really requires not just the engagement of the technologists, the engineers, the scientists, but the humanities folks who have that grounding in cultural awareness,” Banschbach said. “If we’re going to turn AI into an opportunity for humanity instead of a huge threat that has the potential to completely sink us, we’ve got to have the full interdisciplinary conversation.”

Jillian Nguyen, left, and James Dong use AI to generate images of transportation-inspired furniture in a class on using AI tools at the University of Portland.

Ethics debates over the technology were on full display this week in a one-credit class on exploring generative AI tools that a team of University of Portland faculty pulled together over the summer. Jon Down, a business professor, walked the class of engineers, nursing students and physics majors through how to use platforms like DALL-E 2 and Midjourney to generate images. Down had uploaded an image of a painting from a local gallery and asked the bots to create variations of it.

“Would you say that’s art?” Down asked the class. “Do you get any credit for the art or is that just the system?”

Alex Melemai argued yes. Even if AI makes an image, Melemai said, “you made the prompt.”

Claire Beaumont disagreed. “I don’t necessarily think that the person who typed the prompt into an image generator would be able to claim any credit as being the artist or even originator of that work,” she said.

Academics are also wrestling with questions of how AI will impact equity in education. Artificial intelligence can help level the playing field for disadvantaged students in some ways, professors say. For example, English language learners can feel more confident in their writing by using ChatGPT to check their work, University of Portland professor Jacqueline Van Hoomissen said. Scott thinks it has the potential to expand access to effective, individualized tutoring. If AI is freely accessible to everyone, it could help with equity, said Regan Gurung, associate vice provost at Oregon State. But if future iterations of ChatGPT or other technology come at a price, that raises new issues, he said.

“If we get to that point where only students who can afford a certain subscription can turn in those good papers, that’s clearly problematic,” he said.

Melemai, a senior at the University of Portland, has found ChatGPT useful as a tutor. Melemai is in a class that’s making an electric vehicle, but has little experience in that arena and will sometimes ask the bot to explain to him what parts of the vehicle do. He also uses it to check his work on short answers, prompting the bot with questions like: “Would I be correct in saying…”

Melemai said it’s like a smarter version of online forums that he has used in the past to search for experts on certain topics.

As a computer science major, Melemai thinks the possibilities of AI are limitless. He’s considering working in that field in the future.

“I think it’s going to change everything. It’s probably like the biggest thing since the internet,” he said. “We’re at such an infant stage and it’s already such a big deal.”

This story was brought to you through a partnership between The Oregonian/OregonLive and Report for America. Learn how to support this crucial work.

Sami Edge covers higher education for The Oregonian. You can reach her at sedge@oregonian.com or (503) 260-3430.


