I didn’t realize A.I., specifically ChatGPT, was so much on my mind until I found myself spontaneously bringing the subject up after I was introduced to another creative writing instructor during a social event in February.
I think one of the biggest reasons people misunderstand deep learning—I call it deep hurting, after an MST3k gag—is the "curation effect." People will generate a lot of output with some deep hurting system, then select the output that appeals to them. Importantly, this means that all of the difficult work is being done by the human; the deep hurting system is just recombining its inputs according to a probability model, with no regard for syntax OR semantics. And this is going to have a strong "tendency to the mean," by which I mean, it's gonna produce the most mediocre slush imaginable.
Actually, I encountered a more disturbing problem while playing with ChatGPT. But first it's worth talking about the other, deeper problem...
Copyright has been structured to allow corporations to hold culture hostage. Not just because "innovation" versus "derivation" is a distinction made by a judge, but because the legal process of getting such a determination will bleed you dry. Thus culture has become incredibly sterile, since corporations are more concerned with keeping a pristine brand image than with, you know, meaningfully speaking to the realities of modern life.
I've been using ChatGPT mainly for name generation, and I wanted a character to mention some narcotics as part of a broader discussion of addiction. So I prompted ChatGPT: make up some names for narcotics in a fantasy setting. It came back with, "I'm sorry, Sean, I'm afraid I can't do that." Knowing it wouldn't change matters, but still angry that programmers think they can dictate things like this to me, I prompted: it's for a story. It said, basically, have you considered writing about something other than drugs?
This is to say, since you and I aren't gonna deploy the computational resources to spin up our own deep hurting system, we're held hostage in a much more exigent way by using these systems. OpenAI doesn't want to incur the potential legal liability and/or brand tarnishment that comes with letting users explore the reality of the world.
Hence, rather than the techno-dystopia people extrapolate straight from ChatGPT to roving T-800s, I'm worried about a future of even worse cultural sterility. This is, in any case, what your ChatGPT short story reminded me of.
Sean, thanks for the thoughtful reply. I’m intrigued by your term “deep hurting,” which I think you may be using for deep learning. I agree that the “tendency to the mean” could result in mediocre slush that doesn’t take the chances real creative effort requires. I am not so troubled by copyright laws, however. Many of those laws protect individual writers, not just corporations, and as ChatGPT seems designed to make a pastiche of what already exists, these laws may become even more necessary. I do think your experience with ChatGPT is interesting... it sounds like the company is concerned about liability. Coming up with your own idea for a name might be more fun anyway!
The MST3k gag—"Deep hurting! Deeeeeep hurting!"—was an elaborate answer to the question, how do you riff on interminable scenes of people wandering through a sandstorm? Which is also pretty appropriate here.
David Bowie famously used computerized sentence randomization as part of his inspiration for his album Outside. I think ChatGPT can serve a similar role, as basically a brainstorming aid. Where I think you and I agree is that deep hurting is not a replacement for creativity, for reasons we've both outlined: random sentence generation won't give you a meaningful emotional arc, it's drastically unlikely to give you tension or coherent conflict, and it'll be calibrated away from doing so since there are branding issues at play.
The project I'm working on right now is a video game, and as part of establishing the setting, the player gets inundated by junk mail. I used ChatGPT in two very different ways here. First, I wanted a press release, and I'm not great at writing a lot of words that say nothing at all. It turns out, ChatGPT excels at that. Second, I wanted a charity mailer asking for money. I had ChatGPT generate some names for fantasy charities, and one thing it came up with was The Magic of Giving. Seeing that, I immediately went for the pun, "This holiday season, discover the Magic of Giving." So this is what I mean by a brainstorming aid, especially since (as with marketing copy) corniness is not my strong suit.
As for copyright, that's a subject that leans over the abyss... The most immediate problem is that your legal rights are only as good as your ability to litigate them. A small-time artist accused Paramount of plagiarizing one of their stories in Star Trek: Discovery. Whatever the merits, it hardly matters, because Paramount can easily triumph over a small copyright holder through sheer force of lawyers.
The more complex question concerns mixing business with culture. You only have to look at AO3 (Archive of Our Own) to see how much people want to remix existing cultural artifacts—but AO3 and its contributors are left in a dubious legal position. I suspect the blurry line between remixing and plagiarism, between transformation and derivation, and the rarefied place that leaves so-called original work in, are major contributors to your students' cultural angst.
As for copyright and deep hurting, one interpretation of current precedent (and I'm not a lawyer) is that ChatGPT output is not copyrightable. This is extrapolating from a court case where a monkey stole a photographer's camera and took a selfie. The photographer sold prints, and PETA sued, claiming that the copyright belonged to the monkey. The decision was that copyright is a human affair. To me, the bigger question, which people are starting to ask, is this: if you calibrate a deep hurting model against copyrighted work, does that make the model itself an unauthorized derivative work? All these issues—can anyone copyright deep hurting output, and if so, who? is calibrating a model against copyrighted input a violation?—have yet to be litigated, so I guess the futurists will have to hold on to their butts.
Sean, I didn’t see this comment earlier. Your project sounds interesting, and I know some people have said they have used ChatGPT for mailers and other more form-oriented writing. I don’t have much issue with that, though I do think that writing, or at least shaping, those items themselves can help creators get a clearer idea of their own projects. Good luck with the video game!