
[Paper Review] Set Transformer(2018), A Framework for Attention-based Permutation-Invariant Neural Networks

AI 꿈나무 2021. 7. 25. 22:03

Set Transformer, A Framework for Attention-based Permutation-Invariant Neural Networks

Juho Lee, Yoonho Lee, Jungtaek Kim et al., arXiv 2018

 

PDF, Transformer. By Seonghoon Yu, July 25th, 2021

 

Summary

 The Set Transformer is a permutation-invariant function that takes as input a set of elements whose order does not matter. The model consists of an encoder and a decoder, both of which rely on attention. It maps the n input elements to k output vectors and learns the relationships among them. In other words, it represents the n-element input set as k elements in a permutation-invariant way.
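To make the key property concrete, here is a minimal NumPy sketch (not from the paper) showing what permutation invariance means for a set function: reordering the elements of the input set leaves the output unchanged. Simple mean pooling already has this property; the Set Transformer achieves it with attention instead.

```python
import numpy as np

# Permutation invariance: f(X) is unchanged when the rows (set elements)
# of X are reordered. Mean pooling is the simplest such set function.
def mean_pool(X):
    return X.mean(axis=0)

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 4))        # a set of n=5 elements, each of dim d=4
perm = rng.permutation(5)          # an arbitrary reordering of the set

print(np.allclose(mean_pool(X), mean_pool(X[perm])))  # True
```

Mean pooling, however, treats every element independently; the point of the Set Transformer is to replace this naive pooling with attention so that interactions between elements are modeled while keeping the same invariance.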

 

 

(1) Encoder

 

 The encoder X -> Z $\in$ R^{n x d} is a stack of SABs (Set Attention Blocks) or ISABs (Induced Set Attention Blocks). The encoder transforms the data X (n x d) into features Z (n x d).

 

 The computational complexity of SAB is O(n^2), while that of ISAB is O(nm). ISAB uses m inducing points I as the query vectors of its first attention.
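The inducing-point trick can be sketched as follows. This is a simplified single-head version with no learned projections, layer norms, or feed-forward sublayers (which the real ISAB has); the m inducing points first summarize the n set elements, then the elements attend back to those m summaries, so each attention costs O(nm) rather than the O(n^2) of full self-attention.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attn(Q, K, V):
    """Simplified single-head dot-product attention (no projections)."""
    d = Q.shape[-1]
    return softmax(Q @ K.T / np.sqrt(d)) @ V

def isab(X, I):
    """ISAB sketch: m inducing points I (queries) attend to the n set
    elements, then the elements attend back to the m summaries.
    Both attentions are O(nm) in the set size n."""
    H = attn(I, X, X)      # m x d: compressed summary of the set
    return attn(X, H, H)   # n x d: each element enriched by the summary

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 8))   # n = 100 set elements, d = 8
I = rng.normal(size=(4, 8))     # m = 4 inducing points (learned in practice)

Z = isab(X, I)
print(Z.shape)  # (100, 8)
```

Note that the summary H is invariant to reordering X, so the whole block is permutation equivariant: permuting the input rows just permutes the output rows.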

 

 

 

 

(2) Decoder

 

 The decoder aggregates the output of the encoder into a single vector or a set of vectors.

 

 The decoder uses PMA (Pooling by Multihead Attention) with k seed vectors, which should be followed by SABs to model the correlations among the k outputs.

 

Experiment

 Accuracy on the unique character counting task.

 

 

Meta clustering results

 

 

Accuracy for the point cloud classification task

 

 

What I like about the paper

  • Mapping n input elements to k outputs using the attention mechanism
  • A function that is permutation invariant

My GitHub repository for the papers I have read

 

Seonghoon-Yu/Paper_Review_and_Implementation_in_PyTorch

I review papers for study purposes and reimplement them in PyTorch. Contribute to Seonghoon-Yu/Paper_Review_and_Implementation_in_PyTorch development by creating an account on GitHub.

github.com

 
