Searching for "attention" with no syntactic query expansion in all metadata.

Publication years (Num. hits)
1959-1968 (16) 1969-1975 (24) 1976-1977 (15) 1978-1980 (26) 1981-1982 (16) 1983-1984 (16) 1985-1986 (30) 1987 (30) 1988 (50) 1989 (46) 1990 (60) 1991 (37) 1992 (48) 1993 (56) 1994 (70) 1995 (99) 1996 (137) 1997 (163) 1998 (182) 1999 (309) 2000 (373) 2001 (461) 2002 (578) 2003 (752) 2004 (1043) 2005 (1320) 2006 (1510) 2007 (1819) 2008 (1794) 2009 (1337) 2010 (481) 2011 (304) 2012 (313) 2013 (368) 2014 (331) 2015 (394) 2016 (582) 2017 (955) 2018 (1733) 2019 (3186) 2020 (4271) 2021 (5621) 2022 (6847) 2023 (8013) 2024 (2058)
Publication types (Num. hits)
article(22866) book(3) data(2) incollection(128) inproceedings(24715) phdthesis(156) proceedings(4)
Venues (Conferences, Journals, ...)
CoRR(6921) IEEE Access(1116) Sensors(542) ICASSP(509) Neurocomputing(473) Remote. Sens.(456) IJCNN(414) NeuroImage(375) AAAI(350) IEEE Trans. Geosci. Remote. Se...(345) CVPR(340) Multim. Tools Appl.(316) ICIP(306) Appl. Intell.(276) Interspeech(272) CogSci(261) (of 4647 venues in total)
GrowBag keyword graphs: the graphs summarize 13907 occurrences of 6353 keywords.
Results
Found 47874 publication records; all 47874 are shown according to the current facet selection. Each record lists authors, title, venue, and year.
Seong-A. Park, Hyung-Chul Lee, Chul-Woo Jung, Hyun-Lim Yang: Attention mechanisms for physiological signal deep learning: which attention should we take? CoRR 2022
Wei Dai, Rui Liu, Tianyi Wu, Min Wang 0032, Jianqin Yin, Jun Liu 0007: HierAttn: Effectively Learn Representations from Stage Attention and Branch Attention for Skin Lesions Diagnosis. CoRR 2022
Nghia Hieu Nguyen, Duong T. D. Vo, Minh-Quan Ha: VieCap4H-VLSP 2021: ObjectAoA - Enhancing performance of Object Relation Transformer with Attention on Attention for Vietnamese image captioning. CoRR 2022
Xiaofeng Zhang 0004, Yikang Shen, Zeyu Huang, Jie Zhou, Wenge Rong, Zhang Xiong 0001: Mixture of Attention Heads: Selecting Attention Heads Per Token. CoRR 2022
Haoran You, Yunyang Xiong, Xiaoliang Dai, Bichen Wu, Peizhao Zhang, Haoqi Fan 0004, Peter Vajda, Yingyan Lin: Castling-ViT: Compressing Self-Attention via Switching Towards Linear-Angular Attention During Vision Transformer Inference. CoRR 2022
Chandra Kanth Nagesh, Abhishek Purushothama: The Birds Need Attention Too: Analysing usage of Self Attention in identifying bird calls in soundscapes. CoRR 2022
Hongzhi Zhu, Septimiu Salcudean, Robert Rohling: Gaze-Guided Class Activation Mapping: Leveraging Human Attention for Network Attention in Chest X-rays Classification. CoRR 2022
Guochen Yu, Andong Li, Hui Wang 0070, Yutian Wang, Yuxuan Ke, Chengshi Zheng: DBT-Net: Dual-branch federative magnitude and phase estimation with attention-in-attention transformer for monaural speech enhancement. CoRR 2022
Giulio Lovisotto, Nicole Finnie, Mauricio Munoz, Chaithanya Kumar Mummadi, Jan Hendrik Metzen: Give Me Your Attention: Dot-Product Attention Considered Harmful for Adversarial Patch Robustness. CoRR 2022
Yang Zhang 0058, Qiang Ma 0001: Dual Attention Model for Citation Recommendation with Analyses on Explainability of Attention Mechanisms and Qualitative Experiments. Comput. Linguistics 2022
Pengfei Fang, Jieming Zhou, Soumava Kumar Roy, Pan Ji, Lars Petersson, Mehrtash Harandi: Attention in Attention Networks for Person Retrieval. IEEE Trans. Pattern Anal. Mach. Intell. 2022
Weijiang Li, Qing Wang, Jiekun Wu, Zhengtao Yu 0001: Piecewise convolutional neural networks with position attention and similar bag attention for distant supervision relation extraction. Appl. Intell. 2022
Enmin Lu, Xiaoxiao Hu: Image super-resolution via channel attention and spatial attention. Appl. Intell. 2022
Enze Su, Siqi Cai, Longhan Xie, Haizhou Li 0001, Tanja Schultz: STAnet: A Spatiotemporal Attention Network for Decoding Auditory Spatial Attention From EEG. IEEE Trans. Biomed. Eng. 2022
Ye Jin, Mei Wang, Liyan Luo, Dinghao Zhao, Zhanqi Liu: Polyphonic Sound Event Detection Using Temporal-Frequency Attention and Feature Space Attention. Sensors 2022
Sanghun Jeon, Mun Sang Kim: End-to-End Sentence-Level Multi-View Lipreading Architecture with Spatial Attention Module Integrated Multiple CNNs and Cascaded Local Self-Attention-CTC. Sensors 2022
Yibo Feng, Xu Yang, Dawei Qiu, Huan Zhang, Dejian Wei, Jing Liu: PCXRNet: Pneumonia Diagnosis From Chest X-Ray Images Using Condense Attention Block and Multiconvolution Attention Block. IEEE J. Biomed. Health Informatics 2022
Lars Rieser, Brent Furneaux: Share of Attention: Exploring the Allocation of User Attention to Consumer Applications. Comput. Hum. Behav. 2022
Guochen Yu, Andong Li, Hui Wang 0070, Yutian Wang, Yuxuan Ke, Chengshi Zheng: DBT-Net: Dual-Branch Federative Magnitude and Phase Estimation With Attention-in-Attention Transformer for Monaural Speech Enhancement. IEEE ACM Trans. Audio Speech Lang. Process. 2022
Ziqin Ling, Guihua Tao, Yang Li 0172, Hongmin Cai: NPCFORMER: Automatic Nasopharyngeal Carcinoma Segmentation Based on Boundary Attention and Global Position Context Attention. ICIP 2022
Hongzhi Zhu, Septimiu Salcudean, Robert Rohling: Gaze-Guided Class Activation Mapping: Leverage Human Visual Attention for Network Attention in Chest X-rays Classification. VINCI 2022
Julian Gutheil, Klaus Donsa: SAINTENS: Self-Attention and Intersample Attention Transformer for Digital Biomarker Development Using Tabular Healthcare Real World Data. dHealth 2022
Dinh-Hieu Hoang, Gia-Han Diep, Minh-Triet Tran, Ngan T. H. Le: DAM-AL: dilated attention mechanism with attention loss for 3D infant brain image segmentation. SAC 2022
Michael Yeung, Leonardo Rundo, Evis Sala, Carola-Bibiane Schönlieb, Guang Yang 0006: Focal Attention Networks: Optimising Attention for Biomedical Image Segmentation. ISBI 2022
Takafumi Moriya, Takanori Ashihara, Atsushi Ando, Hiroshi Sato, Tomohiro Tanaka, Kohei Matsuura, Ryo Masumura, Marc Delcroix, Takahiro Shinozaki: Hybrid RNN-T/Attention-Based Streaming ASR with Triggered Chunkwise Attention and Dual Internal Language Model Integration. ICASSP 2022
Guochen Yu, Andong Li, Chengshi Zheng, Yinuo Guo, Yutian Wang, Hui Wang 0070: Dual-Branch Attention-In-Attention Transformer for Single-Channel Speech Enhancement. ICASSP 2022
Cheng Zhuang, Yunlian Sun: Ganet: Unary Attention Reaches Pairwise Attention Via Implicit Group Clustering in Light-Weight CNNs. ICASSP 2022
Vandana Rajan, Alessio Brutti, Andrea Cavallaro: Is Cross-Attention Preferable to Self-Attention for Multi-Modal Emotion Recognition? ICASSP 2022
Saket S. Chaturvedi, Lan Zhang 0005, Xiaoyong Yuan: Pay "Attention" to Adverse Weather: Weather-aware Attention-based Object Detection. ICPR 2022
Jiangang Bai, Yujing Wang, Hong Sun, Ruonan Wu, Tianmeng Yang, Pengfei Tang, Defu Cao, Mingliang Zhang, Yunhai Tong, Yaming Yang 0001, Jing Bai 0010, Ruofei Zhang, Hao Sun 0015, Wei Shen: Enhancing Self-Attention with Knowledge-Assisted Attention Maps. NAACL-HLT 2022
Yoon Lee, Haoyu Chen 0001, Guoying Zhao 0001, Marcus Specht: WEDAR: Webcam-based Attention Analysis via Attention Regulator Behavior Recognition with a Novel E-reading Dataset. ICMI 2022
Le Zhang, Qi Feng, Yao Lu, Chang Liu, Guangming Lu: AIA: Attention in Attention Within Collaborate Domains. PRCV (1) 2022
Ye Huang, Di Kang, Wenjing Jia, Liu Liu, Xiangjian He: Channelized Axial Attention - considering Channel Relation within Spatial Attention for Semantic Segmentation. AAAI 2022
Duc Hau Nguyen, Guillaume Gravier, Pascale Sébillot: Filtering and regularization to improve the plausibility of attention weights in NLI. TALN-RECITAL 2022
Adrien Bibal, Rémi Cardon, David Alfter, Rodrigo Wilkens, Xiaoou Wang, Thomas François, Patrick Watrin: Is Attention Explanation? An Introduction to the Debate. TALN-RECITAL 2022
Kuan-Hsien Liu, Ching-Hsiang Chiu, Tsung-Jung Liu: Fusion of Triple Attention to Residual in Residual Dense Block to Attention Based CNN for Facial Expression Recognition. SMC 2022
Javier Laplaza, Francesc Moreno-Noguer, Alberto Sanfeliu: Context Attention: Human Motion Prediction Using Context Information and Deep Learning Attention Models. ROBOT (1) 2022
Andrew Sanders, Bradley Boswell, Andrew A. Allen, Gursimran Singh Walia, Md Shakil Hossain: Development and Field-Testing of a Non-intrusive Classroom Attention Tracking System (NiCATS) for Tracking Student Attention in CS Classrooms. FIE 2022
Giulio Lovisotto, Nicole Finnie, Mauricio Munoz, Chaithanya Kumar Mummadi, Jan Hendrik Metzen: Give Me Your Attention: Dot-Product Attention Considered Harmful for Adversarial Patch Robustness. CVPR 2022
Yingda Fan, Amrinder Arora: Attention Constraint Mechanism through Auxiliary Attention. ICTAI 2022
Xiaohang Liu, Jingfei Jiang, Jinwei Xu, Lei Gao: Evaluating a New Attention Framework Based on Matrix Blocking for Attention Models on FPGAs. ICTAI 2022
Xi Qi, Lihua Tian, Chen Li 0033, Hui Song, Jiahui Yan: Singing Melody Extraction Based on Combined Frequency-Temporal Attention and Attentional Feature Fusion with Self-Attention. ISM 2022
Seong-A. Park, Hyung-Chul Lee, Chul-Woo Jung, Hyun-Lim Yang: Attention Mechanisms for Physiological Signal Deep Learning: Which Attention Should We Take? MICCAI (1) 2022
Michael Hassid, Hao Peng 0009, Daniel Rotem, Jungo Kasai, Ivan Montero, Noah A. Smith, Roy Schwartz 0001: How Much Does Attention Actually Attend? Questioning the Importance of Attention in Pretrained Transformers. EMNLP (Findings) 2022
Xiaofeng Zhang 0004, Yikang Shen, Zeyu Huang, Jie Zhou, Wenge Rong, Zhang Xiong 0001: Mixture of Attention Heads: Selecting Attention Heads Per Token. EMNLP 2022
Yacine Gaci, Boualem Benatallah, Fabio Casati, Khalid Benabdeslem: Debiasing Pretrained Text Encoders by Paying Attention to Paying Attention. EMNLP 2022
Qizao Wang, Xuelin Qian, Yanwei Fu, Xiangyang Xue: Co-attention Aligned Mutual Cross-Attention for Cloth-Changing Person Re-identification. ACCV (5) 2022
Rafael Pedro, Arlindo L. Oliveira: Assessing the Impact of Attention and Self-Attention Mechanisms on the Classification of Skin Lesions. IJCNN 2022
Haotian Zhang, Wei Wu 0032: Transformer Gate Attention Model: An Improved Attention Model for Visual Question Answering. IJCNN 2022
Wentao Fu, Jifeng Sun, Yong Jiang 0001: Stock Selection via Expand-excite Conv Attention Autoencoder and Layer Sparse Attention Transformer: A Classification Approach Inspire Time Series Sequence Recognition. IJCNN 2022
Haoyu Liu, Yang Liu, Xingrui Wang, Hanfang Yang: Exploring Coarse-grained Pre-guided Attention to Assist Fine-grained Attention Reinforcement Learning Agents. IJCNN 2022
Jiayi Zeng, Wei Liu 0004, Ying Zhou, Xinyu Li, Wenqing Cheng: Attention Cloud Map: Visualizing Attention Migration of Students' Debugging Process in Semantic Code Space. ICCSE (3) 2022
Chull Hwan Song, Hye Joo Han, Yannis Avrithis: All the attention you need: Global-local, spatial-channel attention for image retrieval. WACV 2022
Daniel Bolya, Cheng-Yang Fu, Xiaoliang Dai, Peizhao Zhang, Judy Hoffman: Hydra Attention: Efficient Attention with Many Heads. ECCV Workshops (7) 2022
Shenyuan Gao, Chunluan Zhou, Chao Ma, Xinggang Wang, Junsong Yuan: AiATrack: Attention in Attention for Transformer Visual Tracking. ECCV (22) 2022
Mingyu Kim, Kyeongryeol Go, Se-Young Yun: Neural Processes with Stochastic Attention: Paying more attention to the context dataset. ICLR 2022
Si Shi, Wuman Luo, Rita Tse, Giovanni Pau 0001: Attention-LGBM-BiLSTM: An Attention-Based Ensemble Method for Knowledge Tracing. TALE 2022
Caie Xu, Jin Gan, Mingyang Wu, Dandan Ni: AU-GAN: Attention U-Net Based on a Built-In Attention for Multi-domain Image-to-Image Translation. APWeb/WAIM Workshops 2022
Yichong Xu, Chenguang Zhu 0001, Shuohang Wang, Siqi Sun, Hao Cheng 0002, Xiaodong Liu 0003, Jianfeng Gao 0001, Pengcheng He, Michael Zeng 0001, Xuedong Huang 0001: Human Parity on CommonsenseQA: Augmenting Self-Attention with External Attention. IJCAI 2022
Naitik Bhise, Adam Krzyzak, Tien D. Bui: Refining AttnGAN Using Attention on Attention Network. S+SSPR 2022
Chentianye Xu: Integrate Spatial Information into Channel Attention via a Multi-Scale Convolutional Attention Module. CAIBDA 2022
Dairui Liu, Derek Greene, Ruihai Dong: A Novel Perspective to Look At Attention: Bi-level Attention-based Explainable Topic Modeling for News Classification. ACL (Findings) 2022
Nikolai Ilinykh, Simon Dobnik: Attention as Grounding: Exploring Textual and Cross-Modal Attention on Entities and Relations in Language-and-Vision Transformer. ACL (Findings) 2022
Zongwei Wu: Depth Attention for Scene Understanding. 2022
Tristan Gomez: Artificial spatial attention for interpretable deep models of embryonic quality. 2022
Victor Delvigne: Studying Attention in Virtual Reality with Electroencephalography and Eye-Tracking. 2022
Gökhan Güney, Esra Kisacik, Canan Kalaycioglu, Gorkem Saygili: Exploring the attention process differentiation of attention deficit hyperactivity disorder (ADHD) symptomatic adults using artificial intelligence on electroencephalography (EEG) signals. Turkish J. Electr. Eng. Comput. Sci. 2021
Adu Asare Baffour, Zhen Qin 0002, Yong Wang, Zhiguang Qin, Kim-Kwang Raymond Choo: Spatial self-attention network with self-attention distillation for fine-grained image recognition. J. Vis. Commun. Image Represent. 2021
Zaifeng Shi, Cheng Sun, Qingjie Cao, Zhe Wang, Qiangqiang Fan: Residual attention-based tracking-by-detection network with attention-driven data augmentation. J. Vis. Commun. Image Represent. 2021
Sara Spadone, Miroslaw Wyczesany, Stefania Della Penna, Maurizio Corbetta, Paolo Capotosto: Directed Flow of Beta Band Communication During Reorienting of Attention Within the Dorsal Attention Network. Brain Connect. 2021
Yun Liu 0017, Xiaoming Zhang 0001, Qianyun Zhang, Chaozhuo Li, Feiran Huang, Xianghong Tang, Zhoujun Li 0001: Dual self-attention with co-attention networks for visual question answering. Pattern Recognit. 2021
Megan T. deBettencourt, Stephanie D. Williams, Edward K. Vogel, Edward Awh: Sustained Attention and Spatial Attention Distinctly Influence Long-term Memory Encoding. J. Cogn. Neurosci. 2021
Shunsuke Kitada, Hitoshi Iyatomi: Attention Meets Perturbations: Robust and Interpretable Attention With Adversarial Training. IEEE Access 2021
Chean Khim Toa, Kok-Swee Sim, Shing Chiang Tan: Electroencephalogram-Based Attention Level Classification Using Convolution Attention Memory Neural Network. IEEE Access 2021
Mei Zhang, Huihui Su, Jinghua Wen: Classification of flower image based on attention mechanism and multi-loss attention network. Comput. Commun. 2021
Yanping Fu, Yun Liu 0001: CGSPN: cascading gated self-attention and phrase-attention network for sentence modeling. J. Intell. Inf. Syst. 2021
Magdalena Biesialska, Katarzyna Biesialska, Henryk Rybinski: Leveraging contextual embeddings and self-attention neural networks with bi-attention for sentiment analysis. J. Intell. Inf. Syst. 2021
Yan Bin Ng, Basura Fernando: Weakly supervised action segmentation with effective use of attention and self-attention. Comput. Vis. Image Underst. 2021
Volkmar Paul Engerer: 'Attention, attention, exploring minds acknowledge digital structure!' The shift to digital humanities has happened, so what should information scientists do in response? Digit. Scholarsh. Humanit. 2021
Jiasheng Wang, Qiongxin Liu: Distant supervised relation extraction with position feature attention and selective bag attention. Neurocomputing 2021
Zhixiong Nan, Jizhi Peng, Jingjing Jiang, Hui Chen, Ben Yang, Jingmin Xin, Nanning Zheng 0001: A joint object detection and semantic segmentation model with cross-attention and inner-attention mechanisms. Neurocomputing 2021
Joseph F. DeRose, Jiayao Wang, Matthew Berger: Attention Flows: Analyzing and Comparing Attention Mechanisms in Language Models. IEEE Trans. Vis. Comput. Graph. 2021
Kaihao Zhang, Dongxu Li, Wenhan Luo, Wenqi Ren, Lin Ma 0002, Hongdong Li: Dual Attention-in-Attention Model for Joint Rain Streak and Raindrop Removal. CoRR 2021
Priyabrata Karmakar, Shyh Wei Teng, Guojun Lu: Thank you for Attention: A survey on Attention-based Artificial Neural Networks for Automatic Speech Recognition. CoRR 2021
Haoyu Chen 0003, Jinjin Gu, Zhi Zhang: Attention in Attention Network for Image Super-Resolution. CoRR 2021
Zhengyao Wen, Wenzhong Lin, Tao Wang, Ge Xu: Distract Your Attention: Multi-head Cross Attention Network for Facial Expression Recognition. CoRR 2021
Yu Yan, Jiusheng Chen, Weizhen Qi, Nikhil Bhendawade, Yeyun Gong, Nan Duan, Ruofei Zhang: EL-Attention: Memory Efficient Lossless Attention for Generation. CoRR 2021
Tristan Gomez, Suiyi Ling, Thomas Fréour, Harold Mouchère: Improve the Interpretability of Attention: A Fast, Accurate, and Interpretable High-Resolution Attention Model. CoRR 2021
Tom Kersten, Hugh Mee Wong, Jaap Jumelet, Dieuwke Hupkes: Attention vs non-attention for a Shapley-based explanation method. CoRR 2021
Hongyu Gong, Yun Tang 0002, Juan Miguel Pino, Xian Li: Pay Better Attention to Attention: Head Selection in Multilingual and Multi-Domain Sequence Modeling. CoRR 2021
Zhongjie Yu 0001, Devendra Singh Dhami, Kristian Kersting: Sum-Product-Attention Networks: Leveraging Self-Attention in Probabilistic Circuits. CoRR 2021
Zaid Amin, Nazlena Mohamad Ali, Alan F. Smeaton: Visual Selective Attention System to Intervene User Attention in Sharing COVID-19 Misinformation. CoRR 2021
Rafael Pedro, Arlindo L. Oliveira: Assessing the Impact of Attention and Self-Attention Mechanisms on the Classification of Skin Lesions. CoRR 2021
Chenglin Yang, Siyuan Qiao, Adam Kortylewski, Alan L. Yuille: Locally Enhanced Self-Attention: Rethinking Self-Attention as Local and Context Terms. CoRR 2021
Michael Yeung, Leonardo Rundo, Evis Sala, Carola-Bibiane Schönlieb, Guang Yang 0006: Focal Attention Networks: optimising attention for biomedical image segmentation. CoRR 2021
Manjin Kim, Heeseung Kwon, Chunyu Wang, Suha Kwak, Minsu Cho: Relational Self-Attention: What's Missing in Attention for Video Understanding. CoRR 2021
Alana de Santana Correia, Esther Luna Colombini: Attention, please! A survey of Neural Attention Models in Deep Learning. CoRR 2021
Yihe Dong, Jean-Baptiste Cordonnier, Andreas Loukas: Attention is Not All You Need: Pure Attention Loses Rank Doubly Exponentially with Depth. CoRR 2021
Chull Hwan Song, Hye Joo Han, Yannis Avrithis: All the attention you need: Global-local, spatial-channel attention for image retrieval. CoRR 2021
Dinh-Hieu Hoang, Gia-Han Diep, Minh-Triet Tran, Ngan T. H. Le: DAM-AL: Dilated Attention Mechanism with Attention Loss for 3D Infant Brain Image Segmentation. CoRR 2021
Displaying results #901 - #1000 of 47874 (100 per page).
Maintained by L3S.
Previously maintained by Jörg Diederich.
Based upon DBLP by Michael Ley.
Open data: data released under the ODC-BY 1.0 license.