Thanks for joining our Generative AI refresher!
-
Lately we’ve been asked a lot about the AI we use and whether it’s Generative AI.
-
In response, we will, with your permission, help you take the first steps towards becoming an Historical Romance writer with some help from real Generative AI.
-
To start, we’ll need a database of all the Historical Romance that’s been written in the last couple of centuries. Amazingly, it’s possible to buy one if you know where to look (contact us if you don’t). There are about 2 billion words there, including around 50 million belonging to Barbara Cartland, the 20th century’s most successful Historical Romance writer.
-
We now need a compact version of ChatGPT, one more suited to a home PC. Happily, there are several! We’ll choose one from the AI Lab at MIT called Baby Bot (BB).
-
And off we go. BB is loaded onto our PC and that 2-billion-word database is there too. They’ll spend the next couple of hours getting to know each other. At first, BB produces just random letters; after around an hour, words begin to appear; and after a couple of hours, short sentences start to emerge. BB has been linking all those words together in a neural network that’s wired a little like the human brain.
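-
BB’s inner workings are far more sophisticated than we can show here, but a toy sketch can give the flavour of “linking words together”. The Python below is a hypothetical word-bigram model, not BB’s actual neural network: it simply records which word follows which in a tiny made-up corpus, then follows those links to generate new text.

```python
# A toy illustration only: a real neural network is far more sophisticated.
# This bigram model counts which word follows which, then generates text
# by walking those learned links.
import random
from collections import defaultdict

corpus = (
    "she could not go on like this and yet she could not turn back "
    "her heart beat like a drum as the duke drew near"
).split()

# "Training": record every observed word-to-next-word link.
links = defaultdict(list)
for current, nxt in zip(corpus, corpus[1:]):
    links[current].append(nxt)

# "Generation": start from a prompt word and follow the links.
word = "she"
sentence = [word]
for _ in range(10):
    followers = links.get(word)
    if not followers:
        break
    word = random.choice(followers)
    sentence.append(word)

print(" ".join(sentence))
```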
-
Now that BB has a database and is able to produce short sentences, she’s no longer a Bot; she’s a wholly capable generative AI system. Generative, because she can generate new content from the existing material she has in her database.
-
Now, let’s ask her to help with finding a title for our new soon-to-be best seller. We need to give her a prompt to start, so we’ll try: We can’t go on.
-
Now, the job of BB and every other generative AI bot on the planet is simply to predict the next word (or two) in the current sentence. Nothing complicated: it’s as straightforward as that. This is what generative AI does: predict the next number or word.
-
Behind the scenes, BB is comparing a list of potential next words for the title with the millions of other possible next words from the database by assigning a numerical value to each one. She then subtracts the potential next word’s value from each database contender’s value and squares the difference to produce the “Loss”; the pairing with the lowest Loss gets the guernsey. She does this for the millions of possibilities and within a few seconds gives us: like this.
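-
Taking that simplified description at face value (a production model would score candidates with learned probabilities rather than a single number per word), a sketch of the scheme might look like this. The word values below are invented purely for illustration:

```python
# A sketch of the simplified "Loss" scheme described above, not how a
# production model actually scores words. The numerical values assigned
# to each word are invented purely for illustration.

# Hypothetical value BB assigns to the context "We can't go on".
prompt_value = 4.2

# Hypothetical values for a handful of candidate next words drawn
# from the database (a real run would compare millions).
candidates = {
    "like": 4.3,
    "forever": 7.9,
    "Algernon": 12.5,
    "without": 5.8,
}

# Loss = (prompt value - candidate value) squared; lowest Loss wins.
losses = {word: (prompt_value - value) ** 2 for word, value in candidates.items()}
best = min(losses, key=losses.get)

for word, loss in sorted(losses.items(), key=lambda kv: kv[1]):
    print(f"{word!r}: loss = {loss:.2f}")
print(f"Winner: {best!r}")
```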
-
Now we have our best seller’s nascent title: We can’t go on like this. Even though it recognisably belongs to the canon of Romantic Fiction, it’s lacking something. Maybe it’s time to consult Dame Babbs and ask BB to use Barbara’s part of the database to provide that extra boost to our title. After a couple of seconds, Babbs has a ripper of a suggestion: Algernon.
-
Yes. Our title, We can’t go on like this, Algernon, portends a real winner. With input from you, alongside a 2-billion-word database and with Dame Babbs herself contributing, it’ll be ascending the NY Times best seller list before you can say Normalised Loss Function. Easy as…
-
Unfortunately, on looking at our title, we quickly realise that it’s stupidly, gobsmackingly, mind-blowingly daft. As humans, we realise this because we have a sense of context. BB does not, and Gen-AI will take some time to acquire this, or something resembling it, if it ever does.
-
BB has, though, demonstrated at least one human characteristic: Algernon really was quite unexpected, a rabbit pulled out of a hat, just as a human might do. It’s this disjunctive quality that makes Gen-AI appear so hauntingly and frighteningly human.
-
Generally: Gen-AI works best with words and Predictive AI with numbers. Only about 10% of Gen-AI Apps actually use numbers, so Predictive AI is where we should be looking for Health-AI’s Apps.
Barbara Cartland in 1925
Doyenne of 20th-century Historical Romance writers, Dame Barbara Cartland (1901-2000) contributed over 50 million words to the genre, at least half of them dictated to her secretary whilst lying on a chaise longue.