Apr 13, 2023

I think one of the biggest reasons people misunderstand deep learning—I call it deep hurting, after an MST3K gag—is the "curation effect." People will generate a lot of output with some deep hurting system, then select the output that appeals to them. Importantly, this means that all of the difficult work is being done by the human; the deep hurting system is just recombining its inputs according to a probability model, with no regard for syntax OR semantics. And this is going to have a strong "tendency to the mean," by which I mean it's gonna produce the most mediocre slush imaginable.

Actually, I encountered a more disturbing problem while playing with ChatGPT. But first, it's worth talking about the other, deeper problem...

Copyright has been structured to allow corporations to hold culture hostage. Not just because "innovation" versus "derivation" is a distinction made by a judge, but because the legal process to get such a determination will bleed you dry. Thus culture has become incredibly sterile, since corporations are more concerned with having a pristine brand image than with, you know, meaningfully speaking to the realities of modern life.

I've been using ChatGPT mainly for name generation, and I wanted a character to mention some narcotics as part of a broader discussion of addiction. So I prompted ChatGPT: make up some names for narcotics in a fantasy setting. It came back with, I'm sorry, Sean, I'm afraid I can't do that. Knowing it wouldn't change matters, but still angry about how programmers think they can dictate things like this to me, I prompted: it's for a story. It said, basically, have you considered writing about something other than drugs?

This is to say, since you and I aren't gonna deploy the computational resources to spin up our own deep hurting system, we're held hostage in a much more exigent way by using these systems. OpenAI doesn't want to incur the potential legal liability and/or brand tarnishment that comes with letting users explore the reality of the world.

Hence, rather than the techno-dystopia people extrapolate straight from ChatGPT to roving T-800s, I'm worried about a future of even worse cultural sterility. This is, in any case, what your ChatGPT short story reminded me of.
