Story planning: unlimiting creativity

The plot (or narrative) of a story has some similarity with a plan, as usually defined in classical AI planning. This has led researchers to investigate planning-based approaches to automatic story generation (see, for example, the work of Porteous et al. and their demo). However, the planning approach requires as input a formal model of the “world” in which the story is to take place - its characters, places, objects, their relations, and the things they can do. Writing this world model is a great burden on system designers, and limits the creativity that story planning systems can exhibit. Machine learning approaches, on the other hand, can produce generative models from data in a mostly unsupervised manner, but these are so far only able to replicate the surface appearance of fiction and frequently fail to generate a coherent plot (Sunspring is a classic example).

The ultimate goal of automated story generation is to unleash unbounded creativity while still making sense. One way to approach this problem is to draw on the already large, and growing, collections of structured semantic knowledge that are available on the web. Initiatives such as UVI, ConceptNet and Read the Web offer large databases of facts and relations (both abstract and concrete) obtained through text mining and crowdsourcing, while ontologies can provide knowledge about specific areas, or general things like time (see, for example, vocab.org). However, one type of semantic knowledge that is crucial to narratives, and that is currently not easily available, is knowledge about events: what causes them to happen, what their consequences are, and how they chain together.

Realising the vision of unlimited narrative creativity is a great task, which cannot be completed within a single student project. Below are some current specific topics within the scope of this project. One of these can form the basis for an individual research, honours or masters project, while a PhD will span multiple topics and go beyond these starting points.

Topic: Time line extraction

A first step towards understanding the plot of a text is to identify the events that it describes and order them into a consistent time line. Extracting ordered event sequences is also a prerequisite for applying some event model learning methods.
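Once pairwise ordering relations between events have been extracted (by whatever means), assembling them into a consistent time line is essentially a topological sorting problem. A minimal sketch in Python, which assumes the events and “before” relations are already given and which reports an inconsistency (a cycle) by returning None:

```python
from collections import defaultdict, deque

def order_events(events, before):
    """Topologically sort events, given pairs (a, b) meaning
    'a happens before b'. Returns one consistent time line,
    or None if the relations are cyclic (inconsistent)."""
    succ = defaultdict(list)
    indegree = {e: 0 for e in events}
    for a, b in before:
        succ[a].append(b)
        indegree[b] += 1
    queue = deque(e for e in events if indegree[e] == 0)
    timeline = []
    while queue:
        e = queue.popleft()
        timeline.append(e)
        for nxt in succ[e]:
            indegree[nxt] -= 1
            if indegree[nxt] == 0:
                queue.append(nxt)
    return timeline if len(timeline) == len(events) else None

events = ["wake", "breakfast", "leave", "arrive"]
before = [("wake", "breakfast"), ("breakfast", "leave"), ("leave", "arrive")]
print(order_events(events, before))
# → ['wake', 'breakfast', 'leave', 'arrive']
```

The hard part of the topic, of course, is producing the input: extracting the event mentions and the (often only partial) ordering evidence from text.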

Students interested in this topic should read the recent paper by Ruiqi Li, the honours thesis by William James and the report by Pandu Kerr.

Topic: Event model learning from traces

Event (or action) models capture knowledge about possible events, such as their participants, prerequisites, consequences, and relations to other events, and are a key ingredient in a planning approach to narrative generation. Learning action models from observations of plans or plan executions (known as “traces”) has been studied in AI planning. However, event traces extracted from text raise many challenges that have not been considered: events can have complex arguments, traces are partial and uncertain, and the order of events can often only be partially inferred.
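To illustrate the basic idea, here is a deliberately simplistic Python sketch of learning propositional action models from fully observed traces: preconditions are approximated by the intersection of the states in which an action occurs, and effects by the observed state differences. The challenges listed above (complex arguments, partial and uncertain traces, partially ordered events) are precisely what this naive scheme does not handle:

```python
def learn_model(traces):
    """Learn a propositional model for each action from fully observed
    traces: lists of (state_before, action, state_after) triples, where
    states are frozensets of facts."""
    models = {}
    for pre_state, action, post_state in (t for trace in traces for t in trace):
        m = models.setdefault(action, {"pre": set(pre_state), "add": set(), "del": set()})
        m["pre"] &= pre_state          # keep only facts true every time
        m["add"] |= post_state - pre_state  # facts the action made true
        m["del"] |= pre_state - post_state  # facts the action made false
    return models

trace = [
    (frozenset({"at_home", "hungry"}), "eat", frozenset({"at_home"})),
    (frozenset({"at_home"}), "leave", frozenset({"outside"})),
]
print(learn_model([trace]))
```

On this toy trace, “eat” is learned to require at_home and hungry and to delete hungry, while “leave” adds outside and deletes at_home.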

Students interested in this topic should consult the MACQ interactive literature collection for an overview of approaches to model learning in planning.

Topic: Event model learning from text mining

Another approach to learning event/action models is to search for associations using text mining. This was proposed by Sil, Huang and Yates (2010, 2011), but not tried at scale. A similar approach was proposed by Tandon and others (2015, data).
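As a toy illustration of the association-mining idea (a generic sketch, not the method of the cited papers), one can count co-occurrences of extracted event mentions across documents and rank pairs by pointwise mutual information (PMI):

```python
import math
from collections import Counter
from itertools import combinations

def event_associations(documents, min_count=2):
    """Rank pairs of event mentions by PMI over document co-occurrence.
    `documents` is a list of lists of event mentions (e.g. verbs
    extracted upstream). Pairs seen fewer than `min_count` times are
    dropped, since PMI is unreliable for rare pairs."""
    n_docs = len(documents)
    single, pair = Counter(), Counter()
    for doc in documents:
        events = set(doc)
        single.update(events)
        pair.update(combinations(sorted(events), 2))
    scores = {}
    for (a, b), c in pair.items():
        if c < min_count:
            continue
        scores[(a, b)] = math.log((c / n_docs) /
                                  ((single[a] / n_docs) * (single[b] / n_docs)))
    return sorted(scores.items(), key=lambda kv: -kv[1])

docs = [
    ["rain", "umbrella"],
    ["rain", "umbrella"],
    ["walk", "talk"],
    ["walk", "eat"],
]
print(event_associations(docs))
```

Scaling this up, and distinguishing genuinely causal or enabling relations from mere co-occurrence, is where the research questions lie.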

Students interested in this topic should read the project report by Louis Carlin and the papers cited above.

Topic: Compiling event models from knowledge sources

As mentioned above, there are already many resources on the web that provide concepts, facts and relations, such as UVI or conceptnet, as well as hand-crafted models for story generation (e.g., Veale’s, and the fabulous book by William Wallace Cook), that could be combined to build, extend or refine event models.
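As a sketch of what “compiling” might mean here, the following Python fragment turns ConceptNet-style triples into planning-operator skeletons. HasPrerequisite and Causes are actual ConceptNet relation names, but the triples below are made up for illustration rather than queried from the database:

```python
def compile_operators(triples):
    """Compile operator skeletons from (head, relation, tail) triples:
    'HasPrerequisite' edges become preconditions and 'Causes' edges
    become effects of the head event."""
    ops = {}
    for head, relation, tail in triples:
        op = ops.setdefault(head, {"pre": [], "eff": []})
        if relation == "HasPrerequisite":
            op["pre"].append(tail)
        elif relation == "Causes":
            op["eff"].append(tail)
    return ops

# Illustrative triples in the style of ConceptNet edges (not real data).
triples = [
    ("eat", "HasPrerequisite", "have food"),
    ("eat", "Causes", "be full"),
    ("cook", "HasPrerequisite", "have ingredients"),
    ("cook", "Causes", "have food"),
]
for name, op in compile_operators(triples).items():
    print(name, op)
```

A real system would also have to ground these loosely phrased concepts into a consistent set of predicates and arguments, which is a large part of the challenge.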

Students interested in this topic should read the project report by Musha Wen.

Background reading

  • Follow the links!
  • Haslum, “Narrative Planning: Compilations to Classical Planning”, Journal of AI Research, vol. 44, pp. 383-395, 2012, also available on-line from JAIR.
  • Julie Porteous, Fred Charles and Marc Cavazza, “NetworkING: using Character Relationships for Interactive Narrative Generation”, International Conference on Autonomous Agents and Multiagent Systems, 2013 (available here).

Previous students’ work on this topic:

  • David Cowley carried out a user study in interactive fiction and player modelling (honours thesis, 2015).
  • Emily Rodrigo implemented a Markov chain model for text generation, and explored whether coupling it with a topic model could generate more coherent and sensible text (report, 2017).
  • William James (co-supervised with Hanna Suominen) implemented the beginnings of an NLP pipeline for extracting action/event models from text (honours thesis, 2017).
  • Musha Wen investigated the use of the conceptnet database, combined with NLP extraction techniques, to find event relations that may be used to create story planning operators (report, 2018).
  • Louis Carlin implemented and evaluated another technique for extracting event models from text (report, 2019).
  • Jiaqi Zhang (report, 2019) and Debashish Chakraborty (thesis, 2019) both studied the shortcomings of GAN-based text generation.

Requirements

To undertake substantial work on this project, students must have a solid background in computer science and very good programming skills. Some of the simpler topics may be suitable for students with only an introductory knowledge of, for example, machine learning.

Contacts

