Set Transformer: A Framework for Attention-based Permutation-Invariant Neural Networks
Juho Lee, Yoonho Lee, Jungtaek Kim, et al., arXiv 2018
PDF, Transformer | By Seonghoon Yu, July 25th, 2021
Summary
Set Transformer is a permutation-invariant function that takes a set of elements as input, so its output does not depend on the order of the elements. The model consists of an encoder and a decoder, both of which rely on attention. It maps an input set of n elements to k output vectors while learning the relationships among the elements; in other words, it represents the n input elements as k permutation-invariant outputs.
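Permutation invariance itself can be seen in a much simpler model than the paper's attention blocks. The toy sketch below (a Deep Sets-style embed-then-pool function, not the Set Transformer; the weights are arbitrary placeholders) shows what "output does not depend on element order" means:

```python
import numpy as np

rng = np.random.default_rng(0)

def pool_encode(X):
    # Toy permutation-invariant function: embed each element
    # independently, then mean-pool over the set dimension.
    W = np.full((4, 3), 0.5)           # placeholder weights, for illustration only
    H = np.tanh(X @ W)                 # per-element embeddings, shape (n, 3)
    return H.mean(axis=0)              # pooling discards the ordering

X = rng.normal(size=(5, 4))            # a set of n=5 elements in R^4
perm = rng.permutation(5)

out1 = pool_encode(X)
out2 = pool_encode(X[perm])            # same set, different order
print(np.allclose(out1, out2))         # True: the output ignores element order
```

The Set Transformer replaces this simple mean pooling with attention so that interactions between elements can be modeled before and during pooling.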
(1) Encoder
The encoder X -> Z $\in$ R^{n x d} is a stack of SABs (Set Attention Blocks) or ISABs (Induced Set Attention Blocks). It transforms the data X (n x d) into features Z (n x d).
The computational complexity of SAB is O(n^2), while ISAB is O(nm). ISAB uses m learnable inducing points I as the query vectors of its first attention step.
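A minimal numpy sketch of the ISAB idea follows. It is not the paper's implementation: it uses a single head with no learned projections, layer norms, or feed-forward sublayers, and the inducing points are random stand-ins for learned parameters. It only illustrates the two O(nm) attention passes that replace SAB's O(n^2) self-attention:

```python
import numpy as np

def attention(Q, K, V):
    # Scaled dot-product attention (single head, no projections).
    A = Q @ K.T / np.sqrt(Q.shape[-1])
    A = np.exp(A - A.max(axis=-1, keepdims=True))
    A = A / A.sum(axis=-1, keepdims=True)   # row-wise softmax
    return A @ V

n, m, d = 100, 8, 16                   # n set elements, m inducing points
rng = np.random.default_rng(0)
X = rng.normal(size=(n, d))            # input set
I = rng.normal(size=(m, d))            # inducing points (learned in the paper, random here)

# ISAB sketch: two attention passes, each costing O(nm) instead of O(n^2).
H = attention(I, X, X)                 # (m, d): inducing points attend to the set
Z = attention(X, H, H)                 # (n, d): the set attends back to the summary
print(Z.shape)                         # (100, 16)
```

Because m is a fixed hyperparameter, the cost grows linearly in the set size n, which is what makes ISAB practical for large sets.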
(2) Decoder
The decoder aggregates the encoder output into a single vector or a set of k vectors.
It uses PMA (Pooling by Multihead Attention) with k learnable seed vectors, and the PMA should be followed by SABs to model the correlations among the k outputs.
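The PMA step can be sketched with the same simplified attention as above. Again this is a single-head illustration under stated assumptions, not the paper's implementation: the seed vectors are random placeholders for learned parameters, and the follow-up SABs are omitted.

```python
import numpy as np

def attention(Q, K, V):
    # Scaled dot-product attention (single head, no projections).
    A = Q @ K.T / np.sqrt(Q.shape[-1])
    A = np.exp(A - A.max(axis=-1, keepdims=True))
    return (A / A.sum(axis=-1, keepdims=True)) @ V

n, k, d = 100, 4, 16
rng = np.random.default_rng(1)
Z = rng.normal(size=(n, d))            # encoder features for n elements
S = rng.normal(size=(k, d))            # k seed vectors (learned in the paper, random here)

# PMA sketch: the k seeds attend to the encoded set, pooling
# n features down to k output vectors.
out = attention(S, Z, Z)
print(out.shape)                       # (4, 16)
```

With k = 1 this reduces to attention-weighted pooling into a single vector; with k > 1, the subsequent SABs let the k outputs interact.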
Experiment
- Accuracy on the unique character counting task
- Meta clustering results
- Accuracy on the point cloud classification task
What I like about the paper
- Mapping n input elements to k outputs using the attention mechanism
- A learned function that is permutation invariant
My GitHub: notes on the papers I read