hncynic

The best Hacker News comments are written with a complete disregard for the linked article. hncynic is an attempt at capturing this phenomenon by training a model to predict Hacker News comments just from the submission title. More specifically, I trained a Transformer encoder-decoder model on Hacker News data. In my second attempt, I also included data from Wikipedia.

The generated comments are fun to read, but often turn out meaningless or contradictory -- see here for some examples generated from recent HN titles.

There is a demo live at https://hncynic.leod.org/.

Steps

Hacker News

Train a model on Hacker News data only:

  1. data: Prepare the data and extract title-comment pairs from the HN data dump (see the extraction sketch after this list).
  2. train: Train a Transformer translation model on the title-comment pairs using TensorFlow and OpenNMT-tf.
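
The following is a minimal sketch of the pair-extraction step, assuming the dump is newline-delimited JSON of Hacker News API items (fields like id, type, title, text, parent). The actual scripts in data/ may expect a different dump format and do more cleaning.

```python
import json
import sys

def extract_pairs(dump_path):
    """Pair each top-level comment with the title of its parent story.

    Assumes a newline-delimited JSON dump of HN API items with the usual
    fields: 'id', 'type', 'title' (stories), 'text' and 'parent' (comments).
    """
    titles = {}  # story id -> title
    pairs = []   # (title, comment) tuples

    # First pass: collect story titles.
    with open(dump_path, encoding="utf-8") as f:
        for line in f:
            item = json.loads(line)
            if item.get("type") == "story" and item.get("title"):
                titles[item["id"]] = item["title"]

    # Second pass: keep only comments whose direct parent is a story,
    # i.e. top-level comments (replies are skipped, as noted below).
    with open(dump_path, encoding="utf-8") as f:
        for line in f:
            item = json.loads(line)
            if item.get("type") == "comment" and item.get("parent") in titles:
                pairs.append((titles[item["parent"]], item.get("text", "")))

    return pairs

if __name__ == "__main__":
    for title, comment in extract_pairs(sys.argv[1]):
        print(f"{title}\t{comment}")
```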

Transfer Learning

Train a model on Wikipedia data, then switch to Hacker News data:

  1. data-wiki: Prepare data from Wikipedia articles (see the section-splitting sketch after this list).
  2. train-wiki: Train a model to predict Wikipedia section texts from titles.
  3. train-wiki-hn: Continue training on HN data.
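
As a rough illustration of the data-wiki step, here is a minimal sketch that splits raw wikitext into (section title, section text) pairs; the real preprocessing likely strips markup and filters sections more carefully.

```python
import re

# Wikitext section headers look like '== History =='.
SECTION_RE = re.compile(r"^(={2,6})\s*(.*?)\s*\1\s*$", re.MULTILINE)

def wiki_sections(wikitext):
    """Split raw wikitext into (section title, section body) pairs."""
    pairs = []
    matches = list(SECTION_RE.finditer(wikitext))
    for i, m in enumerate(matches):
        title = m.group(2)
        start = m.end()
        end = matches[i + 1].start() if i + 1 < len(matches) else len(wikitext)
        body = wikitext[start:end].strip()
        if body:
            pairs.append((title, body))
    return pairs

if __name__ == "__main__":
    sample = "== History ==\nThe project started years ago.\n\n== Usage ==\nRun the script."
    for title, body in wiki_sections(sample):
        print(title, "->", body)
```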

Hosting

  1. serve: Serve the model with TensorFlow Serving (see the example client after this list).
  2. ui: Host a web interface for querying the model.
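
Once exported, the model can be queried over TensorFlow Serving's REST API. The sketch below assumes the model is served under the name hncynic and uses the tokens/length input signature typical of OpenNMT-tf exports; adjust both to match the actual deployment.

```python
import requests

def query_model(title_tokens, url="http://localhost:8501/v1/models/hncynic:predict"):
    """Send one tokenized title to a TensorFlow Serving REST endpoint.

    The model name ('hncynic') and the input keys ('tokens'/'length') are
    assumptions about the exported model; change them if the export differs.
    """
    payload = {
        "instances": [
            {"tokens": title_tokens, "length": len(title_tokens)},
        ]
    }
    response = requests.post(url, json=payload, timeout=30)
    response.raise_for_status()
    return response.json()["predictions"]

if __name__ == "__main__":
    predictions = query_model("Show HN : My new side project".split())
    print(predictions)
```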

Future Work

  • Acquire GCP credits, train for more steps.
  • It's probably not ideal to use encoder-decoder models. In retrospect, I should have trained a language model instead, on data like title <SEP> comment (see the formatting sketch after this list).
  • I've completely excluded HN comments that are replies from the training data. It might be interesting to train on these as well.
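
For the language-model idea above, the training data could simply be each title concatenated with its comment around a separator token. A hypothetical sketch of that formatting:

```python
SEP = " <SEP> "

def to_lm_example(title, comment):
    """Join a title and a comment into one sequence for language modeling.

    A decoder-only language model trained on this format could later be
    sampled by feeding 'title <SEP>' as the prompt.
    """
    return title.strip() + SEP + comment.strip()

if __name__ == "__main__":
    print(to_lm_example("Ask HN: Is TDD dead?", "Betteridge's law says no."))
```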
