Sending out my writing

A while back, I wrote about how my writing has developed through reading Chuck Palahniuk’s discussions of technique. An even bigger change over the last year has been to focus on publication, in whatever form that takes.

Anxiety over sharing my work has long been a problem. It wasn’t the simple ‘fear of success’ that some people talk about, but rather a ridiculous fear of negative effects from publication. At the same time, I’ve been driven to write stories since I could first write a sentence, and these two drives have been in conflict. Sometimes I’ve thought I should quit writing stories and focus more on other parts of my life – but quitting didn’t work for me either, so I needed to find another way through.

Since moving to Yorkshire, I’ve put more effort into sending work out. A lot of my old work was written with little thought of an audience. It was fun, and some of that work was great, but you lose rigour if you don’t define yourself against any external standards. I wrote some good stories that I have no idea what to do with. An example of this is a story I wrote called Richey Edwards vs Godzilla, a mash-up of indie music and kaijus. It’s a great piece of writing, but almost wilfully obscure.

Change is a strange thing – it can take years but feel sudden. I’ve been toying with ways to put my work in public for a while. Part of this was attending a 2018 Arvon course with Tania Hershman and Nuala O’Connor, which provoked me into one flurry of submissions. The South Downs Way zine project has been an interesting way to explore publication, and putting recent volumes onto etsy has worked well. In 2022, I have become more consistent with submissions (41 so far this year) and it feels like a significant change.

It’s not as if I am now writing things only so they can be published. I have a huge number of ideas and it is more about working on the ones I feel I can find a home for.

Recently I thought about writing a folk horror piece about offices. It was interesting, in that it took the elements of folk horror and transposed them to a corporate setting. But, at the same time, it was mostly a cover version of The Wicker Man. If I’d worked on this, it would have been competent, but I couldn’t imagine being enthusiastic about submitting it. Long stories take a lot of time, and need to be worth spending so much energy on. In the end, I stripped out the elements of the piece I liked, and it will emerge as a smaller, stranger piece than it would otherwise have been. I’ve spent too long writing solely for myself, and I need to make up for lost time.

Planning the South Downs Way zines

The South Downs Way is a series of zines containing short stories that I’ve been publishing since March 2020. The individual stories combine into longer narratives about the lives of their characters. I released the fourth volume, Weird Tales of the South Downs Way, in January 2022, and the fifth (A Foolish Journey) comes out in July.

I always loved the idea of telling a huge story from a set of smaller stories. One of the inspirations for this is Geoff Ryman’s 1998 novel 253 which is made up of the interconnected stories of passengers on a tube train. Another inspiration is comic books, and the way that huge stories might be hinted at in brief references.

The South Downs Way contains a load of different characters who sometimes encounter each other including a tarot reader, a physicist, and a guidebook writer with a broken leg. There are also ghosts, giants, and the Devil himself, who tangles himself in the lives of the people he encounters on the Downs.

For some reason I had the figure of 200 short stories in my head, of which I’ve published 56, with the fifth volume about to appear. I’m over a quarter of the way through my arbitrary target, and I recently stopped to take stock and see where I am going.

Things have definitely sprawled a bit with the writing I did in 2021. When I counted things up early in 2022, I had sketches for 146 stories and about 23 different booklets. Not all of these will be viable, but I easily have enough material to produce my 200 stories. In fact, it looked as if I might produce something longer than I had planned.

All these stories need to combine with the other pieces to produce a coherent whole. I’ve been doing a lot of work since then on shaping and linking the sketches I have – and I’ve already introduced a lot of elements and characters that need resolving. I also decided to make the upcoming zines more clearly themed so they stand more independently.

The biggest change to the project since starting was selling issues on etsy. I was excited by the fact people were buying copies, and it got me thinking about how to make the future volumes work better. How do I make the stories easier to sell and promote? (Which is not to say I’m changing anything about how I write – more a case of making what I do as appealing as possible.)

This project will continue over some years – I don’t want to focus solely on this. I’ve got one volume with the printers (A Foolish Journey) and two more nearly finished (Stories of Sussex Folklore and Once Upon a Time in Brighton and Hove) so I can take a more leisurely pace for a time. I’m going to try to get one more volume out this year, with the others coming out every six months after that.

Learning from Chuck Palahniuk

One of the books I love most is Chuck Palahniuk’s Fight Club. I read it when I was 24, on the plane home after eight months working a dull contract in America. This was probably the perfect time to read that book.

It wasn’t just the story of Fight Club that I found inspiring. Palahniuk’s writing was sharper and more vivid than anything I’d encountered before. His uses of rhythm, repetition and set-piece scenes were incredibly well-crafted.

Palahniuk has described his writing style at length in his writer’s biography Consider This, outlining a whole toolbox of techniques. Recently, he’s been running a Substack newsletter where he often builds on the lessons in Consider This, and I’ve found myself working more on including some of them in my work.

One example is the use of clear physical actions for the characters. Palahniuk explains that a well-crafted gesture embeds the reader within the story. The reader’s brain considers the action, activating the mirror neurones, and Palahniuk sees characters in motion as performing a sort of hypnosis on the reader. Using gestures in my work has also given me a clearer idea of the scenes that I write. I’ve also become more aware of this in my reading. Novels that seem flimsy are often that way because the characterisation comes from dialogue rather than action. Characters need a physical existence.

The other idea is that any piece of prose should include a clock or a gun. There should either be something dangerous that threatens the characters, or some sort of timer counting down, limiting the possible length of the story. Both of these add tension, as well as making the stakes clear.

I’ve been using both of these techniques in my recent writing. At first I did this consciously, asking myself explicitly where these things were in a piece. Now, I can see them emerging as I plan a story. I think my writing is better for it.

Holiday Wardrobe

I recently had a story of mine, Holiday Wardrobe, read by actor Jennifer Aries at the London Liar’s League event. It’s about a disappointing holiday in a magical kingdom.

It’s particularly exciting to have a story selected by the League, as I get to hear my work performed by someone else. When I’m editing, I read my work out and edit it until the text seems to flow perfectly. Another person will take the same text and draw out different pauses and emphasis. It’s an interesting experience.

This is actually my second appearance at Liar’s League, the first one being in 2008, when my story Eat at Lovecraft’s was read by Becky Hands-Wicks. I’ve sent about half a dozen pieces over the 14 years since then, but this is the first one to be selected. Liar’s League is an amazing event, and I’m excited about submitting more in the future.

Using AI as a writing partner

I’ve been curious about GPT-3 as a creative tool since reading about Matt Webb’s experiments in 2020. GPT-3 (Generative Pre-trained Transformer 3) is a language model that can create realistic text. The results are impressive, and it has even been used to write a Guardian editorial. In his experiments, Webb was confronted by phrases and ideas that did not exist before. The model produced original concepts such as “The public bank of Britain”, and passages about “a three-mile wide black ring [that] was found in the ocean using sonar”.

The GPT-3 model is trained on billions of words of Internet content, and Webb has described elsewhere how “Reading GPT-3’s output, for me, feels like dowsing the collective unconscious. I’ve never seen anything so Jungian.”

You can get a quick feel for GPT by playing with the Talk to Transformer page, which allows you to experiment with the basic trained model. There’s a good overview by The Verge, ‘OpenAI’s latest breakthrough is astonishingly powerful, but still fighting its flaws.’ Or, for a more whimsical experiment, Janelle Shane tried asking the model how many legs a horse has, concluding, “It’s grammatically correct, it’s confident, and it’s using a lot of the right vocabulary. But it’s also almost completely wrong. I’m sure I’ve had conversations like this at parties.” The origins of the model mean it’s also particularly well informed about topics such as Miley Cyrus and Harry Potter.

Sadly, I’ve got no chance of getting my hands on GPT-3 any time soon, since it is kept under tight control to stop it from being used for evil. But then I remembered that Shardcore had used the earlier GPT-2 model for his software-generated book length collaboration with John Higgs The Future Has Already Begun.

I realised that GPT-2 ought to be sophisticated enough to produce something worthwhile, so I decided to give the basic GPT-2 model some additional training based on my creative writing. I’ve read recommendations that you need 25MB-100MB of text, but I’m using 6MB of my writing as input (generated from the source documents using Apache Tika). I was then able to use this with a colab notebook built by Max Woolf to do the hard work.
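The corpus-assembly step is straightforward once Tika has converted the documents to plain text. A rough sketch of what that looks like (the folder layout, filenames and minimum-length threshold here are my own illustration, not the exact script I used; the `<|endoftext|>` separator is GPT-2’s standard document boundary token):

```python
from pathlib import Path

def build_corpus(source_dir, output_file, min_chars=200):
    """Concatenate plain-text files into a single training corpus.

    Files shorter than min_chars are skipped, since tiny fragments
    (stray notes, empty exports) mostly add noise to the training data.
    Returns (number of files used, corpus size in characters).
    """
    texts = []
    for path in sorted(Path(source_dir).glob("*.txt")):
        text = path.read_text(encoding="utf-8").strip()
        if len(text) >= min_chars:
            texts.append(text)
    # GPT-2 uses this token to mark boundaries between documents.
    corpus = "\n<|endoftext|>\n".join(texts)
    Path(output_file).write_text(corpus, encoding="utf-8")
    return len(texts), len(corpus)
```

The resulting single text file is what gets uploaded to the colab notebook for fine-tuning.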

(I’d not used colab notebooks before, but I am stunned at how they combine workbook and instructions, along with a free VM to run it all on. For more detail, check out Robin Sloan’s post The Slab and the Permacomputer. It’s amazing to see how lots of people’s hard work has combined, allowing me to play with sophisticated models without knowing much about Python or machine learning.)

The snippets of text generated are identifiably mine in a strange way, but there are flights of fancy that surprise me. A description of a character: “He was a man of his word, not a man of action.” A phrase: “Nobody felt safe watching another human being do something with their lives”. There was a whole mad fantasy about a group of ‘dusk-blue crabs’ who went by the name of ‘the great snout’. There are also moments where the model just goes on and on repeating “Wax tins! Wax tins! Wax tins!”. Weirdly enough there was also a passage about a John Higgs:

John Higgs, the English economist and writer, died on 26th October, 2001. He was 83 years old. He was happy to join the world scene, and for good reason. He and many of his ideas were burned at the stake for their uselessness.

The main issue I have is my training data, which is unbalanced in various ways – a few novel-length texts, lots of notes. As clever as machine learning is, it’s only as good as your inputs.

Writing with GPT-X is not simply about churning out text – this text does need to be worked on. (This is not ‘cheating’ – Burroughs used to screen his manual cut-ups, looking for poignant and interesting generated sections.) There are also different ways to work with the system – Robin Sloan has described some of the techniques he has used, such as hiding prompts from the reader (but not the model) to produce effective writing. These techniques are all waiting to be explored.

Matt Webb has written in detail about his experience of this collaboration in GPT-3 is an idea machine:

Using GPT-3 is work, it’s not a one-shot automation like spellcheck or autocomplete. It’s an interactive, investigative process, and it’s down to the human user to interview GPT-3. There will be people who become expert at dowsing the A.I., just as there are people who are great at searching using Google or finding information in research libraries. I think the skill involved will be similar to being a good improv partner, that’s what it reminds me of.

GPT-3 is capable of novel ideas but it takes a human to identify the good ones. It’s not a replacement for creative imagination. In a 15 minute session with the A.I., I can usually generate one or two concepts, suitable for being worked up into a short story, or turned into a design brief for a product feature, or providing new perspectives in some analysis – it feels very much like a brainstorming workshop, or talking something through with a colleague or an editor.

GPT-X can produce text faster than anyone can read it, but as Sloan writes, “it’s clear that the best thing on the page, the thing that makes it glow, is the part supplied by a person”.

For me, the question is whether it can produce interesting art (particularly art that is not solely interesting because of its process). What I’ve seen so far is both spooky and exciting. Whether this is more than a cheap trick of text remains to be seen, but my initial explorations make me very excited about collaborating further with this model.

A new feed for my audio content

I have set up a new site, audio.orbific.com, which contains a feed for audio recordings. Basically, it’s a podcast, but without the consistency people expect from podcasts nowadays. It will contain stories, voice messages, field recordings, interviews and so on. The first couple of recordings are up. You can follow them there, or watch here for mentions of significant ones.

The first recording is a simple voice message.

The site contains more details, as well as pages for other content.

I want to spend the rest of this post talking about the technical details of setting up a podcast. One of the joys of podcasting when it first emerged around 2004 was that it was a clever hack, built on the RSS file format, enabling people to automatically download files onto an iPod. It’s worth reading Warren Ellis’s evangelical 2004 piece where he tries to explain why this is important. About 15 years later, podcasts are now huge, with Spotify signing a reported $100 million deal with Joe Rogan – but it’s taken a long, long time to reach that point.

One of the initial attractions of podcasting was its grass roots nature. They were made by hobbyists, and there was little way of capturing analytics to sell advertising. Now there are various platforms available which will set up a podcast. Some of these are free, but make their money from advertising (such as Spotify’s anchor platform); others take a fee for hosting.

Setting up a podcast is now easy, compared to the instructions in Ellis’s 2004 piece. But I faced three main issues:

  • I wanted to maintain control of the feed’s address on a domain I owned.
  • I didn’t want to pay large monthly fees for hosting the podcast.
  • I didn’t want to be part of a surveillance mechanism designed to sell advertising.

I considered a WordPress plugin, but that was a little more complicated than I wanted. In the end, the ideal setup was a Jekyll static site with audio files hosted on Amazon S3. There was a template for this on GitHub that I could adapt. It took me a couple of hours to get working, and was relatively simple, although the work would be too much hassle for a lot of people:

  • I needed to fork a GitHub project. The GitHub tools mean the site can be edited directly on the web without knowing about git, so it was not as hard as it might have been.
  • The post files are edited in markdown.
  • I had to edit the DNS for my domain to create a subdomain, and then point that to GitHub Pages.
  • I am using Amazon’s S3 to store the files. Setting this up was a drag, involving lots of forbidding warnings about making S3 buckets public.
  • I set up a Plausible analytics script to track visits. This was something I heard about from James Stanier, and allows site users to be logged without infringing their privacy (it doesn’t even require a GDPR opt-in).
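Under the hood, the feed Jekyll produces is just an RSS 2.0 file where each episode is an item carrying an enclosure tag pointing at the hosted audio – that enclosure is the whole ‘clever hack’ that turns an RSS feed into a podcast. A minimal sketch of the structure in Python (the titles and URLs below are placeholders, not my actual feed):

```python
import xml.etree.ElementTree as ET

def podcast_feed(title, link, episodes):
    """Build a minimal RSS 2.0 podcast feed as a string.

    Each episode is a dict with 'title', 'url' (the hosted mp3,
    e.g. on S3) and 'length' (file size in bytes). The <enclosure>
    element tells podcast clients where to download the audio.
    """
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = title
    ET.SubElement(channel, "link").text = link
    for ep in episodes:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = ep["title"]
        # url, length and type are all required on an enclosure.
        ET.SubElement(item, "enclosure",
                      url=ep["url"],
                      length=str(ep["length"]),
                      type="audio/mpeg")
    return ET.tostring(rss, encoding="unicode")

feed = podcast_feed(
    "Example Audio Feed", "https://audio.example.com",
    [{"title": "A voice message",
      "url": "https://example-bucket.s3.amazonaws.com/ep1.mp3",
      "length": 1234567}])
```

A real feed also needs channel-level description and iTunes-specific tags for the directories, but this is the skeleton every podcast host is generating for you.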

If, after reading the above, you’re interested in doing something similar and want my help, get in touch. For me, the most difficult bit was finding the toolset I needed. That, and dealing with Amazon Web Services configuration, but that bit would be easy to swap out.

Reading Poems on Twitch (7/2/21 at 6pm)

On Sunday evening I’ll be live-streaming on Twitch, reading some of my favourite poems. I’ll start at 6pm GMT, and will go for about an hour.

I don’t expect me reading poetry over the internet to be a huge draw, but one of the things I love about Twitch is the intimacy of tiny audiences, the feeling of presence. And it’s been fun digging through my shelves, handling the books, and realising how many memories are attached to them.

My sudden Twitch obsession comes via DJ Kate St Shields. Kate has been looking at different places to host her DJ sets and has recently settled on Twitch. The service has been around for about nine years, but I’d only heard of it as a video-game streaming service. There is so much more. I can watch a dog called Leyla on her walks. I watched sea-otters swimming in the rain, near Canada – great! Watching cars move through an anonymous intersection in Russia might have been one of the most moving things I have seen.

It’s like something from a sci-fi novel. There are all these little interactive TV stations, whose graphics are almost as good as some of the little stations on cable in late-90s Essex. I can watch a ship docking, or someone sewing. I can watch a self-proclaimed redneck and ex-con doing a delivery round, the chat questions repeated to him by a gadget as he drives. It’s like the few times I caught a pirate radio station when living in Essex – the chat between the tracks was the most interesting thing.

Poetry, for me, has always been about the capture of little moments (which is a poor, reductive definition of poetry, but it’s what I like about it). I love the particular way this art captures moments, and the ephemerality of Twitch seems the perfect place for them.

Bodge Issue 1

Last weekend (on Saturday 23rd, of course), the Liverpool Arts Lab released the first issue of their new zine, Bodge. You can download a free PDF or order a physical copy. The Arts Lab are planning 12 issues, on the 23rd of each month through 2021, and it also includes contributions from some of the Cerne-to-CERN pilgrims.

I’m using my recurring page to talk about ley-lines, which is an excuse to bury myself in books about old stones and earth mysteries. I’m still not 100% sure what I think about the topic, but it’s fun to figure it out. And, as the year moves on and we can actually leave the house, I hope to do a few experiments.

I’ve also been working to send out the physical copies, which arrived earlier on Monday. While the zine is available for free, there is something special about receiving culture as a physical object. I think this is particularly great while we are all physically isolated from each other.

Bodge collects together a loose community of people, some I know well, and many I wish I knew better. There is art, poetry, short essays and even a problem page. It documents a nexus of interests, as well as looking good, and I can’t wait to see what emerges over the year.

Procedurally-Generated Novels

If I was suddenly given ‘fuck-you money’ – about £3 million would do it, I reckon – I would still write computer software. But rather than build financial systems, I’d work on procedurally-generated literature. Enterprise software is interesting, but it doesn’t have the philosophical dimension of trying to make a computer write like a person.

I’ve toyed with this a little through the Mechapoet, which never quite managed to be entertaining enough (we did beat one human poet in a slam, but only one). After an evening in the Basketmakers with Shardcore, I realised I didn’t have the time or patience to put in the work needed for something more impressive.

I made a few other experiments. One was around haiku. These poems are short and often shorn of context, so there is a decent chance of beating human writers. All I’d need is the right data set, and a way of judging the new ones. Applying a fitness function via the web was going to be a great deal of work, and I already had a lot of work in my life. But I occasionally day-dream about procedurally-generated literature.
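The fitness-function idea can at least be sketched in a few lines: generate candidate lines from a word pool and keep only those that hit the 5-7-5 syllable pattern. A toy version, using a tiny hand-counted syllable dictionary of my own (automatic syllable counting is an unreliable problem in itself, which is part of why this is harder than it looks):

```python
import random

# Hand-counted syllables for a small word pool.
SYLLABLES = {"autumn": 2, "moon": 1, "rain": 1, "falls": 1,
             "silent": 2, "over": 2, "the": 1, "old": 1,
             "pond": 1, "garden": 2, "wind": 1, "through": 1,
             "empty": 2, "streets": 1, "cold": 1}

def line_syllables(line):
    return sum(SYLLABLES[w] for w in line.split())

def random_line(rng, target):
    """Rejection-sample word sequences until one has exactly
    the target syllable count."""
    while True:
        words, count = [], 0
        while count < target:
            w = rng.choice(list(SYLLABLES))
            words.append(w)
            count += SYLLABLES[w]
        if count == target:
            return " ".join(words)

def haiku(rng):
    # The 'fitness function' here is only the 5-7-5 constraint;
    # judging whether a result is any good still needs a human.
    return [random_line(rng, 5), random_line(rng, 7), random_line(rng, 5)]

poem = haiku(random.Random(23))
```

Every output is formally a haiku; almost none are worth keeping, which is exactly the judging problem the web-based fitness function was meant to solve.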

Writing entire novels by computer is a long way off. There have been early attempts, and these tend to be avant-garde rather than containing the sustained narrative we want from novels. There is a website that compares human and computer poetry, and it tags some human poets as being particularly “computer-like”. These poems are fairly ‘experimental’ and it is these fringes that are most open to the computer.

The first book that claimed to be written by a computer was The Policeman’s Beard is Half Constructed, written in 1984 using a program called RACTER. It’s not particularly readable, and is remarkable more for how it was written than the contents.

Another early computer-generated book is Nick Montfort’s World Clock, generated from 165 lines of Python code and inspired by the work of the Oulipian Harry Mathews. The text is interesting, but relies on its structure. I could imagine having responded to this sort of text in an MA workshop class, but it doesn’t have the narrative drive one would expect from a novel.

There was a lovely 2014 Sabotage Reviews piece reviewing nine computer-generated novels, which included a couple of particularly fascinating examples. One was “a desperate talking clock written by the people of Twitter”, using entries mentioning each specific time. Another “creates a harrowing story from tweets mentioning National Novel Writing Month”.

The writer of this piece, Harry Giles, found an interesting angle on generated literature. He suggested that it is part “of the Oulipian tradition of writing from constraint: if you make such-and-such a ruleset, what kind of writing might happen?” He also compared the results to internet-inspired ‘anti-literature’ forms:

[the thing that generated texts are] closest to is the flattened affect and repetitions of alt-lit, with dashes of uncreative writing, flarf and other post-internet poetics. In other words: as humans increasingly write in dialogue with the internet and machine automations, machines are increasingly being written in dialogue with human literature.

Probably the best example of generated text is the Magic Realism Bot. This produces some beautiful images, but it is basically a ‘madlibs’ style program, inserting a pool of words into specific places in pre-prepared sentences. On top of this, the creator prunes out the obvious misfires. It’s a beautiful piece of work, but relies heavily on human innovation and intervention.
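That ‘madlibs’ approach is only a few lines of code – word pools slotted into pre-prepared sentence templates. A sketch of the technique (these templates and pools are my own invention for illustration, not the bot’s actual data, and the real bot’s human pruning step is exactly what this lacks):

```python
import random

TEMPLATES = [
    "A {person} falls in love with {a_thing}.",
    "In {a_place}, there is {a_thing} that can speak.",
]
POOLS = {
    "person": ["librarian", "cartographer", "nun"],
    "a_thing": ["a glass piano", "an infinite staircase", "a paper moon"],
    "a_place": ["Vienna", "a forgotten lighthouse", "the year 1761"],
}

def magic_sentence(rng):
    """Pick a random template and fill each slot from its word pool."""
    template = rng.choice(TEMPLATES)
    # Unused pool keys are simply ignored by str.format.
    return template.format(**{k: rng.choice(v) for k, v in POOLS.items()})

sentence = magic_sentence(random.Random(7))
```

The surprise all comes from curating the pools so that almost any combination lands somewhere strange, which is why the human work behind such bots matters more than the program.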

(This intrusion of human editing has long been a part of generated texts, even before the computer. William Burroughs would spend hours making cut-ups, which involved slicing physical text with a knife and realigning it to see where new meaning emerged. Out of all this work he would pick the best pieces.)

I actually own one book written by a computer, a version of John Higgs’ recent book on the future, produced by Shardcore with the assistance of GPT-2. The grammar is pretty good, and it turns up some beautifully-weird phrases.

In Higgs’ The Future Starts Here, Shardcore describes the “big scene of bottists who are generating novels and books of poetry. They build these machines to write the stuff, but in the expectation that nobody’s going to read them. You read the first page and think ’I get the gist of this’, but you don’t go on, because it doesn’t make any sense… For it to be a book that you want to read, there’s a lot more to it…

Shardcore wrote a long blog post detailing how he worked through different techniques before using GPT-2 – “Markov chains produced the usual markov stuff”, and there were also failures from word-level and character-level Recurrent Neural Nets. But he hit the motherlode when GPT-2 created a weird description of the film The Breakfast Club. An entire book was subsequently produced, credited to Algohiggs.
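“The usual markov stuff” is easy to demonstrate: a word-level Markov chain just records which words follow which, then walks those transitions at random – which is why its output drifts plausibly from word to word but can’t hold a thought. A minimal sketch (the sample text is mine, not Shardcore’s corpus):

```python
import random
from collections import defaultdict

def build_chain(text):
    """Map each word to the list of words that follow it in the text."""
    chain = defaultdict(list)
    words = text.split()
    for a, b in zip(words, words[1:]):
        chain[a].append(b)
    return chain

def generate(chain, start, length, rng):
    """Walk the chain from a start word, picking each successor at random."""
    out = [start]
    for _ in range(length - 1):
        followers = chain.get(out[-1])
        if not followers:
            break  # dead end: no recorded successor
        out.append(rng.choice(followers))
    return " ".join(out)

chain = build_chain("the cat sat on the mat and the dog sat on the rug")
sample = generate(chain, "the", 8, random.Random(1))
```

Every adjacent pair in the output occurred somewhere in the source, but nothing longer than a pair is guaranteed – hence the characteristic drift.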

Computer-generated texts are going to become more common. There is even a computer-generated textbook on Amazon, Lithium-Ion Batteries: A Machine-Generated Summary of Current Research, which costs £30.

My favourite generated text (which I’ve taken almost a thousand words to get to!) is Emily Short’s Annals of the Parrigues. This is a travel guide to an imaginary country, produced from a number of out-of-copyright source texts. It’s hypnotic, with some clear glimpses of a literature.

Procedural generation works well in video games, where it can generate sufficiently-interesting content more easily than people can. In 1984’s Elite, it allowed the game to contain far more worlds than would otherwise fit in the limited memory and storage space of the time. It’s interesting to think about what sort of book might be generated directly by software and read by people who didn’t care that it was computer-generated. If I had the time and the money, I’d love to find out. Since I don’t, I’ll carry on writing stories the easy way.

New Atlas Obscura Site: The Portslade Gassie

A new Atlas Obscura entry recently appeared near my house, for something I’d never heard of: the Portslade Gassie. I’d walked past the site several times without noticing anything, so it seemed like a good destination for a walk.

Apparently, there were several of these wooden boats, which acted as a form of public transport across a canal to the gas works. According to the sign, the site the boats served had grown to 40 acres by 1926.

The boat itself is a ruin, sitting at a busy, grim junction. Litter in the area was hidden by weeds. It made me wonder who maintains these things, and decides they must stay in place, even as they become overgrown and ruined. Who was this placed here for? Was someone waiting for the boat to rot enough that it could be removed? But, if nothing else, it provided something to see on an empty lockdown Sunday.