Thursday, February 23, 2023

#54 / Tesla And ChatGPT: Some Common Elements

The San Francisco Chronicle featured an "Open Forum" article on its editorial page on February 1, 2023. The article was authored by Joshua Pederson, a professor of humanities at Boston University. My search for information about Pederson indicates that he gets really high "Rate My Professor" scores from his students. Pederson's article was titled, "New tech, same old student cheating." The article addressed some potential problems with a recently-released "Artificial Intelligence" program called ChatGPT, developed and now made available to the public by OpenAI.
Here's how OpenAI describes its new invention: 

We’ve trained a model called ChatGPT which interacts in a conversational way. The dialogue format makes it possible for ChatGPT to answer followup questions, admit its mistakes, challenge incorrect premises, and reject inappropriate requests. ChatGPT is a sibling model to InstructGPT, which is trained to follow an instruction in a prompt and provide a detailed response.
Pederson reacts to this new AI program as follows:

As students across the country settle into a new semester, their professors are trying to figure out what to do about ChatGPT — the free artificial intelligence writing tool capable of producing surprisingly realistic prose in response to just about any prompt you can come up with. Already, the tool has been used to compose publishable stories for tech websites, ad copy for wireless commercials, and even a pretty good Jerry Seinfeld-style joke about airplanes.

My colleagues in academia are worried about something else: Can it write college essays?

Well, what if ChatGPT can write college essays? (Hint: it definitely can.) Pederson is not really too upset. Here's his reaction: 

If computers can write papers that are essentially indistinguishable from ones produced by students, how should we respond? Should we create new assignments? Develop new strategies to discourage cheating or have students handwrite their essays in class? Or do we abandon such assignments altogether?
I’ve taught writing at the university level for nearly two decades now, and I have a simple answer: We should do nothing at all. Or at least nothing we haven’t been doing already...
ChatGPT doesn’t present professors with a problem we haven’t seen before. Don McCabe, a Rutgers University professor sometimes referred to as the founding father of academic integrity research, conducted a 13-year survey from 2002 through 2015 which found that 62% of undergraduates admitted to having cheated on a written assignment at least once. The reality is that plagiarism has long been a fact of life at American universities. ChatGPT just gives students a new tool to accomplish this very old task.
It is, as Pederson's title reads, the "same old student cheating." Not much new here!
Before commenting more directly on Pederson's idea that student cheating is just the normal way that students relate to academic assignments, let me note that there seem to be some common elements in what Tesla is doing, aiming for 100% self-driving automobiles, and what OpenAI is attempting to do with its ChatGPT program. In both cases, the companies are trying to eliminate any need for actual human beings to be involved in activities which have historically been understood to demonstrate and exemplify human mastery. 
Let's phase ourselves out, in other words. That seems to be the suggestion. As indicated in my earlier blog posting about Tesla's self-driving program, I think the most pertinent question is "Why?" Why would we want to remove human beings from the equation? We know that it will be difficult to train a computer to match human reasoning, whether in writing something worthwhile or in driving a car through a difficult situation. Maybe, hard as it might be, we could nonetheless do it. The real question for me is: "WHY would we want to do that?"
I do think, as earlier suggested, that this is a profound and truly "existential" question, and as a college professor myself, I can't say that I am as unconcerned as Pederson seems to be about what ChatGPT may mean for education. 
First, my own teaching experience would never lead me to use the phrase, "same old student cheating," as though "student cheating" is the norm, and is just something to be expected. In my thirteen years of teaching a Senior Capstone course in the UCSC Legal Studies Program, I have only had one occasion in which I found that a student had pursued what Pederson described as "the old-fashioned way" of cheating — by buying a student paper online. 
Second, the Senior Capstone course I teach explores "Privacy, Technology, And Freedom." It is my opinion, based on a lot of reading, that technology is becoming ever more powerful, and that what technology can do, today, is just a pale adumbration of what it will be able to do in the future. Maybe today's ChatGPT isn't all that great, at least based on that Seinfeld-style joke (track it down and read it, to see if you agree). But technology will get a lot better; we can be sure of that.

As Pederson says in his article - and I completely agree with this:
Writing helps me think better. I often don’t know how I feel about a particular topic until I write it down, until I force myself to compose my thoughts and commit them to the page (or the screen). I tell them [his students] that writing is a way to make sense of our lives and experiences, to give order to the story we tell ourselves about ourselves. And I tell them that good writing can be a joyful experience.
Any student who believes what Pederson has just said, as quoted above, will not want to cheat, because cheating for a "grade" is actually a way for a student to cheat herself or himself. That said, however, the pressures on students to achieve high grades in their classes are very great, and mounting. Education is ever more expensive, and it is ever more seen as a type of "credentialing" activity, in which passing courses, getting good grades, and graduating with a degree that can lead to a higher-paying job is the main and essential motivation for attending school. In that environment, the ability of Artificial Intelligence to help students achieve those higher grades could, in fact, tempt students to do whatever they need to do to get them, even if that will actually undermine and impoverish students' efforts to make their own education their primary objective. 

Because that could be true, I'm nervous when I hear about a new AI technology that can write student papers that may end up being indistinguishable from papers written by students who have themselves researched and mastered the topics they are writing about.

I am also nervous when I hear about efforts to create a completely "self-driving car." Phasing human beings out of the equation (whatever equation) is not my idea of "progress."
