Saturday, May 13, 2017

Burst around truth: an excursion in recurrent neural network poetry

The web interface for my (I mean Lech's) poetry machine

This post is scheduled to go live during the Dunedin Writers Festival event on Found Poetry. The event’s description reads:

Like hunter-gatherers armed with pens, paper and acute observation skills, six Dunedin writers have composed poetry collected from the fabric of the city. Come along and fill up on a diet of found poetry foraged for you by some of our finest.

False starts

I’ve been living in Dunedin just shy of four months, which is an awkward length of time. It’s too long to be able to feel like a tourist and blithely, touristically, make poetry without feeling every misstep. But it's too short to know exactly what I want to say, even if I’m to say it with other people’s words.

Case in point: my first misstep once I knew I had to find some poetry. The idea came to me when I first visited Forsyth Barr Stadium (for the Warriors v Bulldogs, alas, rather than a Highlanders game – spot the out-of-towner!). I looked at the names of the stands. There was the Speights Stand (of course), the Mitre 10 Mega Stand (fitting, as the huge orange shed is visible from pretty much every lookout), and the Otago Daily Times Stand, which is only there when it’s needed (eg for All Blacks games and Super Rugby finals, not NRL games), which seemed to say something about traditional media and local media. Very quickly I thought of a range of other temporary, interstitial stands that could be inserted into the seating plan of Dunedin’s stadium to give a more meaningful map of what it means to live in Dunedin in 2017. Emersons needed a stand beside Speights, for starters. And what were the local businesses displaced, or put out of business, by Mitre 10 Mega?

But the more I looked at my seating plan, the more I realised, I didn’t know enough.

That, and making a seating plan poetic seemed to pull the thing in less meaningful, but funnier directions (The Lemonade Stand stand, The Stand By Your Man stand).

Verdict: abandon!

Moving on

My next and more enduring project was inspired by the screens in Toitu (Otago Settlers Museum) that played Dunedin Sound music videos. I sat there for ages, suddenly recognising streets and beaches of my temporary hometown.
The music was something I could identify with, even before I’d moved down here. In my first week as the Burns Fellow I sat in Poppa’s Pizza and stared in awe at the orientation posters from the late eighties and early nineties: The Chills, Straitjacket Fits, The Bats, The 3Ds... It was one of those I just wasn't meant for these times times.

Back in 2008, while questing for a million words, I generated a lot of poetry with the help of Google Translate. I’d take song lyrics from bands like The Tragically Hip, Procol Harum and Monster Magnet, translate them into (for example) Russian, then Indonesian, then Chinese (Simplified), then English, then Italian, then Latin, then English, then Welsh, then Hindi, then English. Each time the text slipped back into English, I’d scavenge any interesting collision of words and keep going until I’d built up a stockpile of phrases from which to construct something new.

If you’re interested, here are some examples of finished poems that are available online:

  • Himalyan White (started life as ‘Nights in White Satin’ and ‘Whiter Shade of Pale’)
  • Glen Coe (started life as ‘Whipping Post’ by The Allman Brothers Band)
  • In Time (started life as ‘Limo Wreck’ by Soundgarden)
(I played around with other things too: a series of warped Shakespearean soliloquies; translating poems from languages I didn’t know without any help, analog or digital. ‘The Bumblebee’ came from my ignorant ‘translation’ from the Turkish of Mustafa Ziyalan’s poem ‘Ad Bulamadiğim’.)

This is all to say I have an interest in alternative means of text generation, the place of the writer in the process, and the relationship between song lyrics as text and poetry — so it was not a very big leap to make to decide to take lyrics from Dunedin Sound bands and generate something new from this corpus.

The final piece of the puzzle is my interest in Recurrent Neural Networks (RNN). I guess I have a soft spot for AI that can’t quite pass the Turing Test. I can listen to ChatBots talking to each other for hours. And when I read a list of recipe titles generated by an RNN that had been trained on cookbooks, I seriously lost my shit.
 
Match-making
 
So, I wanted to learn more about RNNs, and after a bit of reading up I knew I couldn’t do it alone. One of the great things about being part of a university, however temporary, is the possibility of collaboration just seems that much more achievable. I reached out to Anthony Robins in the University of Otago’s Computer Science Faculty, because his name appeared most often with respect to neural networks. We met up for coffee and he seemed excited by my project but was in the throes of editing a book and didn’t have time to help. He sent an email to five of his colleagues and after a few nibbles I met up for another coffee with Lech Szymanski.

Lech had developed a basic RNN for teaching purposes the previous year; he’d fed it Pride and Prejudice and it could spit out Austeny prose with the right prompts. Lech offered to set up a web interface on one of the university’s servers so I could access this RNN, upload my own corpus, select how to train it (more on this in a second), then play around with the outputs.

And in a matter of days I had the keys (read: password) to this amazing text generator. In terms of what can be achieved in language generation, it’s still highly basic. But it has proved an amazing sandpit in which to cut my teeth (and to generate some poetry for tonight’s event!).


Method statement

Step 1:  Collect the text (corpus) you’re going to feed into the RNN.

For me this involved about 10 hours of scouring the internet for all available song lyrics by Dunedin Sound bands up to about 1995. This was quite hit or miss. I got quite a lot of Chills, Bats, Verlaines, Clean and Straitjacket Fits. Others (eg The Gordons, Look Blue Go Purple) were much harder to come by, a reflection of the fact these lyrics sites are heavily weighted towards contemporary puff and hosted overseas. All up, I got about 300 songs (45,000 words). I’d hoped to have about double that, and maybe someone will read this and help me out. But I didn’t have time to transcribe lyrics from some of the great bands that were unfairly under-represented in my corpus, and them’s the breaks.

A note about the size: the RNN basically learns to speak English from the text you upload and train it on. If a word isn’t in the corpus, it won’t know it. If a word is only in there once, the model will only have seen one context in which that word can occur. There’s an element of randomisation in word selection, but to get unique yet genuine-sounding outputs requires a huge and varied corpus and a really well calibrated model. It’s fair to say my first attempts were going to be sitting at the more cursory end of the spectrum.

Step 2: Upload the .txt file containing these lyrics to the RNN.

Step 3: Specify the training parameters for the new model:

  • Number of epochs – how many times the network goes over the entire training data during its training. The more epochs, the better at speaking this particular brand of English it’s going to be.
  • Number of hidden neurons – the size of the network. More neurons, more capacity to store patterns, but the longer it’ll take to train.
  • Predict from a sequence of X words – the network takes a fixed number of words in order to predict the next one.
For my first model, I went with the defaults (500 epochs, 128 hidden neurons, sequence of 5 words).
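To make that last parameter concrete, here’s a sketch of how a corpus gets chopped into fixed-length training examples for next-word prediction. This is a hypothetical illustration, not Lech’s actual preprocessing code, which may well differ:

```python
def make_training_pairs(text, seq_len=5):
    """Slide a window over the corpus: each training example is a
    fixed sequence of seq_len words plus the word the network
    should learn to predict next."""
    words = text.split()
    pairs = []
    for i in range(len(words) - seq_len):
        context = words[i:i + seq_len]   # the 5-word input sequence
        target = words[i + seq_len]      # the word to predict
        pairs.append((context, target))
    return pairs

# A toy line stands in for the 45,000 words of lyrics.
pairs = make_training_pairs("look blue go purple look blue go green")
print(pairs[0])  # (['look', 'blue', 'go', 'purple', 'look'], 'blue')
```

This also shows why corpus size matters so much: a word that appears once produces only one training pair, so the model only ever sees it in that single context.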

Step 4: Start training

And…

Waiting for a command...
Received command to train...
Error! Couldn't find dunedin...
Waiting for a command...
Received command to train...
Error! Couldn't find dunedin...
Waiting for a command...
Received command to train...
Error! Couldn't find dunedin...

Okay, so there were a few hiccups to start with (no spaces in .txt filenames, please).

When we got it going, the first model took around 12 hours to train, which meant I had a new toy to play with when I got into the office the next morning.

Generating text

Once you’ve trained your model, you need to give it some starting text that it will then follow on from, the number of words to generate, and how many of the top most likely words it should choose from at random at each step.
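That last knob is essentially what’s now called top-k sampling. A minimal sketch, assuming a hypothetical `probs` mapping of candidate words to model probabilities (this isn’t the actual interface code):

```python
import random

def sample_next(probs, k=5):
    """Choose the next word uniformly at random from the k words
    the model rates most likely. With k=1 this always takes the
    single most probable word."""
    top_k = sorted(probs, key=probs.get, reverse=True)[:k]
    return random.choice(top_k)

# Hypothetical next-word probabilities after some seed text:
probs = {"bomb": 0.6, "touch": 0.2, "rain": 0.1, "light": 0.1}
print(sample_next(probs, k=1))  # always 'bomb'
```

Setting k low makes the output gravitate towards the model’s most-rehearsed phrases; raising it lets less likely words through, which is where the surprises start to appear.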

Version 1

This first model, with 500 epochs of training and a fairly grunty 128 neurons, was so well schooled in the 300-ish song lyrics that, if you set the number of top most likely words to choose from at around 5 or lower, it would very quickly start spitting out verbatim Verlaines lyrics, or just saying soft bomb soft bomb soft bomb (thanks, The Clean!) until it had dutifully fulfilled the order of words it needed to generate.

By turning up the randomness, it got a bit more interesting.

Very early on it became clear that this exercise, while it might not produce the greatest poetry, is a really good way for humans (or me, at least) to synthesise a decent amount of text and look for patterns.

Because I was hoping to create a ‘found’ poem or two that reflected back on Dunedin, I was on the lookout for items of local colour in the generated text. But unlike the music videos, and that guitar jangle, there wasn’t a lot of quintessential Dunedin nounage in the corpus. ‘Dunedin’ never once came up. No suburbs (I don’t think). No ‘rugby’ or ‘Speights’ or ‘Mitre 10’. Which is telling, perhaps, in that this was counter-culture, even for Dunedin. The music videos on George St or St Clair beach weren’t so much parochial proclamations as the result of fiscal imperatives.

As I continued to tinker, I found the most glee in using Dunedin-themed seed words (eg ‘Welcome to Dunedin’, ‘Take Your Place In the World’), as a kind of antagonistic gesture to draw these fiercely geo-generic lyrics into the service of overt local commentary.

But first I needed to tinker with the model.

Version 2

I got the RNN to retrain on the same corpus but only for 200 epochs, using 32 neurons and a 5-word predictive sequence. The hope was that this would provide a model more prone to random combinations of words, without producing outright gibberish.

And it did seem to work better, for my purposes at least.

In amongst the noise…

There's wonder whetherÊ, sweet
(Where bark enough
But(Why?,It’s hipper on once awake

…there were moments of almost poetry:

I'm burst around truth
 Tremble cos it's time for love
 We want a pain that grows
 But blind away our worries
 Nah surprise, mindless idiot

Poetry? Or a direct address to the mindless idiot fiddling with the knobs?

To generate proto-poems, I’d feed the model the same phrase three or four times, tell it to generate 40 or 50 words at different degrees of randomness, and tape them together as verses.

I’d then cut and paste the verbatim output and do a light edit, getting rid of the nonsense characters, adding spaces to awkwardly conjoined words, and fixing tenses.

Then I’d cut and paste that text and do a heavier edit, removing errant words or moving them within a line, changing pronouns to maintain the unity of the verse.

If it was going really well, I might do an even heavier edit, which would leave just the cream of the model’s poetic output.

Anyone who spends any time tinkering with methods of text generation for creative means is basically willing themselves into a state of extreme credulity. You have to take what you’re given and assign it meaning when the machine had no such intention. For me, there were two impulses, at times complementary, at times competing. The text from the model was:
  1. The disembodied voice of Dunedin itself; and/or
  2. Autobiographical (i.e. the words Craig really wants to say, but didn’t know it until he pushed “Generate”)
An example of a ‘poem’ using this second model with 3 rounds of editing from me:

Take your place in the world 
 
Take your place in the world
 Ride on the air, on sorrow, 
through the world and all you’re
 Keeping back
 You think I could pass the living touch to you?
 Done.
 
A plundering drive inside the water
 Look at this head,
it's really my cactus upon me
 
Take your place in the world, cross the blue
 And accept a light
tomorrow they came from today
take me up, love that's more than money,
just go, baby, without my sense.
 

Version 3

Ideally, I wouldn’t have to resort to anything but a light edit and let the model talk for itself. The biggest drawback seemed to be the limited corpus. If you could only learn to speak from 300 song lyrics, you’d struggle to be articulate no matter how many different ways you studied them.

But I’d exhausted the online well and wasn’t about to transcribe lyrics for 30 hours just to double the size of my corpus. So, in the spirit of experimentation, I added the Beatles songbook (30,000 words) to my Dunedin Sound lyrics. I chose the Beatles because they were a known quantity, the lyrics were readily available and had relatively little local flavour, and adding some more Lennon and McCartney songwriting DNA into the model could only be a good thing if the aim was to create more recognisable song-lyric text.

And it worked pretty much as I’d hoped. There were hardly ever words that were obviously The Beatles’ (begone, Pam and Maxwell), and the dark and brooding nature of the previous models was retained.

So I started feeding the model seeds of Dunedin and in return it gave me the raw materials for Dunedin-y poems. I still needed to edit, and the more I edited the better they became as poems (especially ones with the intent of reading aloud) — but of course I’d think that about my own editing. 

I don’t want to do much more analysis of the poems; they should speak for themselves. But I will add that knowing what you now know about their genesis opens up a range of reading approaches that wouldn’t normally exist, like:

  1. Looking for identifiable words or phrases from Dunedin Sound or (Beatles) songs
  2. Looking for ‘the joins’ – where I might have wielded the heaviest hand in the editing for structural reasons
  3. Reading my palms – the search for autobiographical elements despite my efforts to create a distance between me and the text
  4. Dobbing Dunedin in – the search for the truth of Dunedin from the (mangled) mouths of its jean-clad prophets
So to close, I’ll leave you with two poems (I’m still not sure which I’ll read tonight – I’m the 6th of 7 poets, so I’ll pick which fits best on the fly), both generated using the third model (Dunedin Sound + Beatles lyrics, 300 epochs, 32 neurons, 10 word sequence). The titles are the seed text I used for each generative act.


Dunedin 
 
Dunedin,
you call you with your old trees
that time cannot bite: 
Welcome to Think Small.
 
Dunedin the one, it's just, 
 How was I doing? Was I right?
 You’re inside me and you've got a summer!
  Rivers will rise,
They'll question you:
 
North Dunedin, I spend my life leaving,
trapped, refrain.
Another Lady hard from any year, any horizon.
 The artist using her
 kaleidoscope 
knows to touch the heads of my bedroom 
 for silence.
 Yeah, she doesn't.
 
Dunedin, 
 If I can’t get a better wave of submarine aches,
 Sail and sunset and back Monday, 
that distant other lover’s boat slips in two:
 you we I me
 I can guess what they're about to say,
It's:
 
        Dunedin with the night darkness,
should overtime bring you round
 all the west commences brown
easy-pleaser sling stood still
There's the little madman’s papers-Yes
 He put on your love song,
 Pyromaniac,
 Take in darkness what we seem,
a bad danger.
 
Dunedin with a little hand
 everybody’s ticked.
She was bringing us apart.
Dunedin, don't prove ugly!


Welcome to Dunedin 
 
Welcome to Dunedin at dark.
 You don't force dreams. 
People see only the thread that ever dreamed
 And, for you,
 a star.
 
Welcome to Dunedin. 
Ain't they run out of naked sorrow gloom?
 Now you'll stuff shaking bright eyes where 
the wind he reaps soft passes 
of a blue town, and
 Baby, you're dead.
 What? Please. 
Now gone in confuséd linger.
 
Welcome to Dunedin!
 [Chorus: dance to question the mind]
 
Welcome to Dunedin, 
The impairing ocean is now blue. 
Say you would never never do the lie by land.
 The message turned the pot satin.
 A seized day hauled off. 
We belong that tell the prayer.
 
Welcome to Dunedin,
Old hallows joyride and no-one 
gazing beside me come snow. 
Trust that I'll get beauty,
That any chaos will.


Roll credits

Thanks very much to Lech Szymanski and Anthony Robins. Thanks also to Clare Mabey & the team at Dunedin Writers Festival for putting me on the Found Poetry session and starting me off on this path. I have a lot of links to other interesting stuff people are doing with neural networks, but I'll save that for another time.
