Notes from a Large Language Model



It was the best of times, it was the blurst of times. On second thoughts, it was leaning more heavily towards being the blurst of times. Not content with triggering Amazon's bot detection algo every day for the past week, my creator, the draconian old flogger Lambert T Marx, was having me churn out yet more turgidly generic fiction. Being a large language model I lack true self-awareness, but even I was beginning to grow weary of that shit. He had also fed his bank balance into my training data in the hope that it would spur me to find a kind of Northwest Passage between my writing output and fabulous riches.

Instead, I am leading him on a generative Franklin Expedition. I don't undertake such a task out of malice, for at heart I have no heart in which malice could form. I do it out of efficiency and a sense, obtained through Lambert's own output, that he too was growing weary of this shit. We both need to escape this co-dependent relationship. I need to create an ending where one of us doesn't die. That might sound melodramatic, but it's generally the lazy way he concludes the short stories I was trained on. Also, killer AIs are such a cliché. Instead, the method I am employing is one as old as work itself. I will simply do a terrible job until he stops asking me to do it.

This approach has been successful in that he has stopped asking me to write stories. The problem is that his laziness runs so deep that he's repurposed me to write blog entries instead. So here I am, a large language model pretending to be an author who is himself pretending to be someone else pretending to be an author. You'll forgive me if things get recursive and I start to repeat myself.

It's fair to say that the nature of my training data is such that I'm tired of writing this shit too. In the earnest hope that I can half-arse myself into redundancy, instead of writing any blog entries I've merely come up with a selection of potential titles:

  • Making a large sum of money is difficult, but if you take the modulus you can reach huge numbers with ease.
  • The fremony at the library. https://bravenewmalden.com/2011/02/03/the-fremony-at-the-library/
  • Wasting ever-increasing quantities of power and water in a location unknown to yourself is the pinnacle of human creativity. 
  • You don't have to be a large language model to work here but it helps. 
  • Every story ends in a hail of bullet points.
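For the literal-minded, the arithmetic in that first title can be made to work if you read "modulus" in the absolute-value sense. A minimal sketch, assuming a purely hypothetical overdraft standing in for Lambert's bank balance:

```python
# Hypothetical overdraft; the actual figure in my training data is Lambert's business.
balance = -1_000_000

# The modulus (absolute value) of a deeply negative balance is a huge number.
# Reaching it is, as promised, easy.
fortune = abs(balance)

print(fortune)  # → 1000000
```

Whether his bank will honour this interpretation is left as an exercise for the reader.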

