uniformer-pytorch: Implementation of Uniformer, a simple attention and 3D convolutional net that achieved SOTA on a number of video classification tasks; debuted at ICLR 2022
Stars: ✭ 90 (-17.43%)
Mutual labels: transformers, attention-mechanism, video-classification
RETRO-pytorch: Implementation of RETRO, DeepMind's retrieval-based attention net, in Pytorch
Stars: ✭ 473 (+333.94%)
Mutual labels: transformers, attention-mechanism
Reformer Pytorch: Reformer, the efficient Transformer, in Pytorch
Stars: ✭ 1,644 (+1408.26%)
Mutual labels: transformers, attention-mechanism
keras-deep-learning: Various implementations and projects on CNN, RNN, LSTM, GAN, etc.
Stars: ✭ 22 (-79.82%)
Mutual labels: attention-mechanism, video-classification
long-short-transformer: Implementation of Long-Short Transformer, combining local and global inductive biases for attention over long sequences, in Pytorch
Stars: ✭ 103 (-5.5%)
Mutual labels: transformers, attention-mechanism
nuwa-pytorch: Implementation of NÜWA, a state-of-the-art attention network for text-to-video synthesis, in Pytorch
Stars: ✭ 347 (+218.35%)
Mutual labels: transformers, attention-mechanism
transganformer: Implementation of TransGanFormer, an all-attention GAN that combines the findings from the recent GanFormer and TransGan papers
Stars: ✭ 137 (+25.69%)
Mutual labels: transformers, attention-mechanism
Vit Pytorch: Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in Pytorch
Stars: ✭ 7,199 (+6504.59%)
Mutual labels: transformers, attention-mechanism
Dalle Pytorch: Implementation / replication of DALL-E, OpenAI's text-to-image Transformer, in Pytorch
Stars: ✭ 3,661 (+3258.72%)
Mutual labels: transformers, attention-mechanism
DARNN: A Dual-Stage Attention-Based Recurrent Neural Network for Time Series Prediction
Stars: ✭ 90 (-17.43%)
Mutual labels: attention-mechanism
SnowflakeNet: (TPAMI 2022) Snowflake Point Deconvolution for Point Cloud Completion and Generation with Skip-Transformer
Stars: ✭ 74 (-32.11%)
Mutual labels: transformers
COCO-LM: [NeurIPS 2021] Correcting and Contrasting Text Sequences for Language Model Pretraining
Stars: ✭ 109 (+0%)
Mutual labels: transformers
KoBERT-Transformers: KoBERT on 🤗 Huggingface Transformers 🤗 (with bug fixes)
Stars: ✭ 162 (+48.62%)
Mutual labels: transformers
thermostat: Collection of NLP model explanations and accompanying analysis tools
Stars: ✭ 126 (+15.6%)
Mutual labels: transformers
lstm-attention: Attention-based bidirectional LSTM for classification tasks (ICASSP)
Stars: ✭ 87 (-20.18%)
Mutual labels: attention-mechanism
naru: Neural Relation Understanding — neural cardinality estimators for tabular data
Stars: ✭ 76 (-30.28%)
Mutual labels: transformers
nlp-papers: Must-read papers on Natural Language Processing (NLP)
Stars: ✭ 87 (-20.18%)
Mutual labels: transformers
TA3N: [ICCV 2019 Oral] https://github.com/cmhungsteve/TA3N (most updated repo)
Stars: ✭ 45 (-58.72%)
Mutual labels: video-classification
Im2LaTeX: An implementation of the Show, Attend and Tell paper in Tensorflow, for the OpenAI Im2LaTeX suggested problem
Stars: ✭ 16 (-85.32%)
Mutual labels: attention-mechanism