# verbphysics


## About

This repository contains the data and reference implementation for the paper

> **Verb Physics: Relative Physical Knowledge of Actions and Objects**
> Maxwell Forbes and Yejin Choi
> ACL 2017

See the Verb Physics project page for more details (model visualization, paper link, BibTeX citation).

## Installation

The code is written in Python 2.7. We recommend a fresh virtualenv.
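For example (a minimal sketch, assuming `virtualenv` is installed; the environment name `venv` is arbitrary):

```sh
# Create and activate an isolated Python 2.7 environment.
virtualenv -p python2.7 venv
source venv/bin/activate
```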

```sh
# Install the required Python libraries.
pip install -r requirements.txt

# Install the locally-packaged `ngramdb` library (written by Li Zilles).
pip install lib/ngramdb/

# Download the data (cached ngramdb data; GloVe embeddings; trained factor
# weights; NLTK data).
./scripts/data.sh
```

Our Travis CI script validates the installation instructions above by running them on a fresh machine after every code change.

## Running

By default, the code is set up to run a particular model from the paper (our model (A)):

```sh
python -m src.main
```

You can view all of the default configuration options by running with `--help`:

```
python -m src.main --help
usage: main.py [-h] [--config CONFIG] [--poly POLY] [--viz]

verbphysics reference implementation

optional arguments:
  -h, --help       show this help message and exit
  --config CONFIG  hyperparameter configuration to use; options: model_a |
                   playing | model_b_objpairs | model_b_frames (default:
                   model_a)
  --poly POLY      Whether to try polynomially-many hyperparameter config
                   combinations (True, default) or vary config dimension
                   sequentially (False).
  --viz            Whether to dump model / data to JSON for visualization
                   (default False).
```

Settings (hyperparameter configurations) are found in `src/settings.py`. You can modify the `playing` dictionary found in `src/main.py` with your own configuration and run the custom model using `--config=playing`, as sketched below.
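To illustrate the idea only (the keys below are invented placeholders; the real option names are defined in `src/settings.py`):

```python
# In src/main.py: a hypothetical `playing` configuration. Replace these
# made-up keys and values with real options from src/settings.py.
playing = {
    'target': 'frames',   # assumed name: which task to predict
    'seed_ratio': 0.05,   # assumed name: fraction of labeled seed data
    'lbp_iters': 100,     # assumed name: belief propagation iterations
}
```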

## Data

The verbphysics data is found under `data/verbphysics/`.

### Task setup as in the ACL 2017 paper

When predicting action frames, only 5% of the action frame data should be used. Either 5% (our model A) or 20% (our model B) of the object pair data may be used to assist in action frame prediction.

When predicting object pairs, only 5% of the object pair data should be used. Either 5% (our model A) or 20% (our model B) of the action frame data may be used to assist in object pair prediction.

### Attribute names in code

For legacy reasons, the code uses different names for some attributes. The actual data (i.e., the questions asked of Mechanical Turk workers) uses the attribute names reported in the paper.

| attribute | name in code |
| --------- | ------------ |
| size      | size         |
| weight    | weight       |
| strength  | hardness     |
| rigidness | rigidness    |
| speed     | verb-speed   |
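For convenience, the same mapping as a Python dict (a sketch for translating between paper and code names; not part of the codebase):

```python
# Maps attribute names as reported in the paper to the names used in the code.
PAPER_TO_CODE_ATTR = {
    'size': 'size',
    'weight': 'weight',
    'strength': 'hardness',
    'rigidness': 'rigidness',
    'speed': 'verb-speed',
}
```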

## Visualization

You can use factorgraph-viz to visualize verbphysics factor graph models interactively in your web browser. To produce the visualization data, run with the `--viz` command line argument.
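For example, to dump the visualization JSON for the default model:

```sh
# Dumps model / data to JSON for visualization (flag shown in --help above).
python -m src.main --viz
```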

The Verb Physics project page has a live demo of this running.

*An example rendering of a factor graph using the factorgraph-viz library.*

## See also

The `py-factorgraph` library provides the underlying factor graph implementation.
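A minimal usage sketch of `py-factorgraph`, adapted from that library's README (the variables and potentials here are toy values, unrelated to verbphysics):

```python
import numpy as np
import factorgraph as fg

# Build a tiny factor graph: a binary RV 'a' and a ternary RV 'b'.
g = fg.Graph()
g.rv('a', 2)
g.rv('b', 3)

# Attach a unary factor to 'a' and a pairwise factor over ('b', 'a').
g.factor(['a'], potential=np.array([0.3, 0.7]))
g.factor(['b', 'a'], potential=np.array([
    [0.2, 0.8],
    [0.4, 0.6],
    [0.1, 0.9],
]))

# Run loopy belief propagation and print the resulting marginals.
iters, converged = g.lbp(normalize=True)
g.print_rv_marginals()
```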