Fojiba-Jabba Notes

Fojiba-Jabba is the module of Cruft Alarm that supports automatic text generation.

Theoretical Foundations

Fojiba-Jabba uses techniques from the theory of Markov Chains and Recursive Transition Networks.

Markov Chains

One method of text generation involves Markov Chains. In theory, Markov Chains can produce a delightfully quirky text; in practice, they sort of suck.

Process

The process can be summarized as follows:

  • The user specifies an initial word and the number of sentences desired in the text.
  • Fojiba-Jabba, having previously analyzed a set of texts in order to gather statistics on which words follow which words, uses these data to generate the next word.
  • This process repeats until the desired number of sentences has been produced (a minimal sketch follows this list).
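
As a minimal sketch of this process in Python (the function names and the punctuation-based sentence detection are assumptions for illustration, not Fojiba-Jabba's actual code):

  import random
  from collections import defaultdict

  def build_chain(corpus):
      # Order-1 statistics: map each word to every word observed to follow it.
      words = corpus.split()
      chain = defaultdict(list)
      for current, following in zip(words, words[1:]):
          chain[current].append(following)
      return chain

  def generate(chain, start, num_sentences):
      # Walk the chain from the user's initial word, counting a sentence
      # whenever a word ends in sentence-final punctuation (so the corpus
      # must contain some, or this loops forever).
      word, output, done = start, [start], 0
      while done < num_sentences:
          followers = chain.get(word)
          word = random.choice(followers) if followers else random.choice(list(chain))
          output.append(word)
          if word.endswith(('.', '!', '?')):
              done += 1
      return ' '.join(output)

For example, generate(build_chain(open('corpus.txt').read()), 'The', 3) would produce three sentences starting with 'The' (where corpus.txt stands in for whatever corpus is available).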

Problems

There are, however, several problems with this method:

  • The corpus available is too limited to attempt anything but an Order-1 Markov Chain, in which each word depends only on the one before it; at higher orders most contexts occur just once in the corpus, so the chain essentially regurgitates the original text.
  • An Order-1 Markov Chain is often too dumb to produce anything but rather ungrammatical (and clearly fake) sentences.

Possible Solutions

  • Use highly advanced linguistic knowledge to improve grammaticality (e.g., a noun or an adjective must follow a determiner). A Brill Part-of-Speech Tagger or the Stanford Parser may be useful here; a sketch of this idea follows the list.
  • Use Google to find likely following words, or to increase the dataset somehow.
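
As a sketch of the first idea (NLTK's off-the-shelf tagger stands in here for the Brill tagger or the Stanford Parser; the rule table and function are invented for illustration, and tagging isolated words out of context is admittedly crude):

  import nltk  # assumes nltk.download('averaged_perceptron_tagger') has been run

  # One hand-picked rule: a determiner must be followed by a noun or an adjective.
  ALLOWED_AFTER = {'DT': {'NN', 'NNS', 'JJ'}}

  def grammatical_followers(previous_word, candidates):
      # Filter the Markov Chain's candidate next words by part of speech.
      prev_tag = nltk.pos_tag([previous_word])[0][1]
      allowed = ALLOWED_AFTER.get(prev_tag)
      if allowed is None:
          return candidates          # no rule for this tag: anything goes
      kept = [w for w in candidates if nltk.pos_tag([w])[0][1] in allowed]
      return kept or candidates      # fall back rather than dead-end

The generator above would call this on chain[word] before choosing, trading some quirkiness for grammaticality.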

Recursive Transition Networks

Another method of text generation involves Recursive Transition Networks. While more grammatical than Markov Chains by design (in theory, anyway; in practice, they also sort of suck), they are slightly more difficult to implement and, unless cleverly constructed, have less of the idiosyncratic charm that fans of Markov Chains find so endearing.

Process

Text generation using Recursive Transition Networks proceeds as follows:

  • Fojiba-Jabba takes a set of texts and runs a part-of-speech tagger over them. Two hash maps are created: one mapping each word to its possible parts of speech (e.g., 'can' to 'noun' and 'modal') and one mapping each part of speech to its possible words (e.g., 'verb' to 'lick' and 'frolic'). A rules array containing every sequence of part-of-speech tags that any sentence exhibits is also created.
  • To generate a sentence, Fojiba-Jabba picks a random element (a sequence of part-of-speech tags) from the rules array, then maps each tag to an English word using the second hash map, thereby producing a sentence (a sketch follows this list).
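
A minimal sketch of both steps in Python (the identifiers are invented, and NLTK's tokenizer and tagger stand in for whatever tagger Fojiba-Jabba actually uses):

  import random
  import nltk  # assumes the 'punkt' and 'averaged_perceptron_tagger' data are installed
  from collections import defaultdict

  def train(sentences):
      word_to_tags = defaultdict(set)   # e.g. 'can' -> {'NN', 'MD'}
      tag_to_words = defaultdict(list)  # e.g. 'VB' -> ['lick', 'frolic']
      rules = []                        # every tag sequence any sentence exhibits
      for sentence in sentences:
          tagged = nltk.pos_tag(nltk.word_tokenize(sentence))
          rules.append([tag for _, tag in tagged])
          for word, tag in tagged:
              word_to_tags[word].add(tag)
              tag_to_words[tag].append(word)
      return word_to_tags, tag_to_words, rules

  def generate_sentence(tag_to_words, rules):
      # Pick a random tag sequence, then fill each slot with a random word of that tag.
      return ' '.join(random.choice(tag_to_words[tag]) for tag in random.choice(rules))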

Problems

  • Parts of speech alone are not enough to guarantee grammaticality (a tag sequence captures no agreement or word-to-word dependencies), so the sentences produced by the RTN are possibly even more ungrammatical than those produced by the Markov Chains.

Possible Solutions

  • Add more statistical or linguistic knowledge.
  • Create a set of rules by hand (a recursive sketch follows this list).
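
For the hand-written option, the rules could even be made genuinely recursive, which is what puts the 'recursive' in Recursive Transition Network. A toy sketch (every rule and word below is invented for illustration):

  import random

  # A nonterminal expands to a sequence of terminals (word classes) and other
  # nonterminals; 'NP' reaches itself through 'PP', so phrases can nest.
  RULES = {
      'S':  [['NP', 'VP']],
      'NP': [['DET', 'N'], ['DET', 'N', 'PP']],
      'PP': [['P', 'NP']],
      'VP': [['V'], ['V', 'NP']],
  }
  WORDS = {
      'DET': ['the', 'a'],
      'N':   ['cruft', 'alarm', 'slug'],
      'V':   ['licks', 'frolics'],
      'P':   ['near', 'under'],
  }

  def expand(symbol):
      # Recursively expand a symbol into a list of words.
      if symbol in WORDS:
          return [random.choice(WORDS[symbol])]
      words = []
      for part in random.choice(RULES[symbol]):
          words.extend(expand(part))
      return words

  print(' '.join(expand('S')))  # e.g. 'the slug frolics near a cruft'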

To Do

  • Work on Smoke Alarm.
  • Eh.