

Discrete Word Embedding for Logical Natural Language Understanding

Masataro Asai; Zilu Tang
We propose an unsupervised neural model for learning a discrete embedding of words. Unlike existing discrete embeddings, our binary embedding supports vector arithmetic operations similar to continuous embeddings. Our embedding represents each word as a set of propositional statements describing a transition rule in the classical/STRIPS planning formalism. This makes the embedding directly compatible with symbolic, state-of-the-art classical planning solvers.
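To illustrate what "vector arithmetic" can mean for binary embeddings, here is a minimal sketch. The vectors below are hypothetical toy values, not the paper's learned embeddings; the sketch only uses the general fact that bit vectors form a group under XOR, which gives a discrete analogue of the familiar "king - man + woman" analogy from continuous embeddings.

```python
import numpy as np

# Hypothetical 8-bit binary embeddings, for illustration only
# (NOT the vectors learned by the proposed model).
king  = np.array([1, 0, 1, 1, 0, 0, 1, 0], dtype=np.uint8)
man   = np.array([1, 0, 0, 1, 0, 0, 0, 0], dtype=np.uint8)
woman = np.array([0, 1, 0, 1, 0, 0, 0, 0], dtype=np.uint8)

# Bit vectors form a group under XOR, so subtraction and addition
# coincide: "king - man + woman" becomes king XOR man XOR woman.
queen_est = king ^ man ^ woman

print(queen_est.tolist())  # → [0, 1, 1, 1, 0, 0, 1, 0]
```

In a continuous space the analogue would be `king - man + woman` followed by a nearest-neighbor lookup; in the binary setting, nearest neighbors would be found by Hamming distance.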

