- Title
Recursive tree grammar autoencoders.
- Authors
Paaßen, Benjamin; Koprinska, Irena; Yacef, Kalina
- Abstract
Machine learning on trees has mostly focused on trees as input. Much less research has investigated trees as output, which has many applications, such as molecule optimization for drug discovery or hint generation for intelligent tutoring systems. In this work, we propose a novel autoencoder approach, called recursive tree grammar autoencoder (RTG-AE), which encodes trees via a bottom-up parser and decodes trees via a tree grammar, both learned via recursive neural networks that minimize the variational autoencoder loss. The resulting encoder and decoder can then be utilized in subsequent tasks, such as optimization and time series prediction. RTG-AE is the first model to combine three features: recursive processing, grammatical knowledge, and deep learning. Our key message is that this unique combination of all three features outperforms models that combine any two of the three. Experimentally, we show that RTG-AE improves the autoencoding error, training time, and optimization score on synthetic as well as real datasets compared to four baselines. We further prove that RTG-AEs parse and generate trees in linear time and are expressive enough to handle all regular tree grammars.
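To illustrate the grammar-based decoding the abstract describes, here is a minimal, hypothetical sketch: a decoder that emits one production-rule index per node can deterministically rebuild a tree from a regular tree grammar in linear time (one rule consumed per node). The grammar, rule indices, and function names below are illustrative assumptions, not the paper's actual code or API.

```python
# Illustrative regular tree grammar: each nonterminal maps to a list of
# productions (node symbol, child nonterminals). This toy grammar for
# arithmetic expressions is an assumption for demonstration only.
GRAMMAR = {
    "E": [("plus", ["E", "E"]),   # rule 0: E -> plus(E, E)
          ("x", []),              # rule 1: E -> x
          ("y", [])],             # rule 2: E -> y
}

def decode(rules, start="E"):
    """Rebuild a tree from a sequence of rule indices, one per node.

    Runs in linear time in the number of nodes, since each rule index
    is consumed exactly once and expands exactly one node.
    """
    it = iter(rules)

    def expand(nonterminal):
        symbol, child_nonterminals = GRAMMAR[nonterminal][next(it)]
        return (symbol, [expand(c) for c in child_nonterminals])

    return expand(start)

# Decode the rule sequence [0, 1, 0, 2, 1] into plus(x, plus(y, x)):
tree = decode([0, 1, 0, 2, 1])
# -> ("plus", [("x", []), ("plus", [("y", []), ("x", [])])])
```

In an RTG-AE-style model, the decoder network would score the candidate rules at each step instead of reading a fixed index sequence; this sketch only shows why generation under such a grammar is linear in tree size.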
- Subjects
Intelligent tutoring systems; Drug discovery; Grammar; Deep learning
- Publication
Machine Learning, 2022, Vol. 111, Issue 9, p. 3393
- ISSN
0885-6125
- Publication type
Article
- DOI
10.1007/s10994-022-06223-7