arXiv:2008.11649

Discrete Word Embedding for Logical Natural Language Understanding

Masataro Asai; Zilu Tang
2020-08-26T16:15:18Z
We propose an unsupervised neural model for learning a discrete embedding of words. Unlike existing discrete embeddings, our binary embedding supports vector arithmetic operations similar to continuous embeddings. Our embedding represents each word as a set of propositional statements describing a transition rule in the classical/STRIPS planning formalism. This makes the embedding directly compatible with symbolic, state-of-the-art classical planning solvers.
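As an aside on what a STRIPS-style transition rule looks like, the sketch below shows a word represented as a pair of binary effect vectors (add effects, delete effects) applied to a binary propositional state. This is a toy illustration of the STRIPS semantics the abstract refers to, not the paper's actual encoding; the dimensionality and all vector values are hypothetical.

```python
# Illustrative sketch: a word as a STRIPS-style transition rule over a
# binary propositional state. The (add, delete) pair and all values here
# are hypothetical; the paper's actual encoding may differ.
import numpy as np

def apply_word(state, add, delete):
    """Apply a STRIPS-style effect: clear the delete bits, then set the add bits."""
    return (state & ~delete) | add

# Toy 8-dimensional binary state and one word's effect vectors.
state  = np.array([1, 0, 1, 0, 0, 0, 1, 0], dtype=np.uint8)
add    = np.array([0, 1, 0, 0, 0, 0, 0, 0], dtype=np.uint8)
delete = np.array([1, 0, 0, 0, 0, 0, 0, 0], dtype=np.uint8)

next_state = apply_word(state, add, delete)
print(next_state.tolist())  # -> [0, 1, 1, 0, 0, 0, 1, 0]
```

Because states and effects are plain bit vectors, applying a word is a pair of bitwise operations, which is what makes such representations directly consumable by symbolic planners.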
