FacetedDBLP

Searching for "attention" with no syntactic query expansion in all metadata.

Publication years (Num. hits)
1959-1968 (16) 1969-1975 (24) 1976-1977 (15) 1978-1980 (26) 1981-1982 (16) 1983-1984 (16) 1985-1986 (30) 1987 (30) 1988 (50) 1989 (46) 1990 (60) 1991 (37) 1992 (48) 1993 (56) 1994 (70) 1995 (99) 1996 (137) 1997 (163) 1998 (182) 1999 (309) 2000 (373) 2001 (461) 2002 (578) 2003 (752) 2004 (1043) 2005 (1320) 2006 (1510) 2007 (1819) 2008 (1794) 2009 (1337) 2010 (481) 2011 (304) 2012 (313) 2013 (368) 2014 (331) 2015 (394) 2016 (582) 2017 (955) 2018 (1733) 2019 (3186) 2020 (4271) 2021 (5621) 2022 (6847) 2023 (8013) 2024 (2058)
Publication types (Num. hits)
article(22866) book(3) data(2) incollection(128) inproceedings(24715) phdthesis(156) proceedings(4)
Venues (Conferences, Journals, ...)
CoRR(6921) IEEE Access(1116) Sensors(542) ICASSP(509) Neurocomputing(473) Remote. Sens.(456) IJCNN(414) NeuroImage(375) AAAI(350) IEEE Trans. Geosci. Remote. Sens.(345) CVPR(340) Multim. Tools Appl.(316) ICIP(306) Appl. Intell.(276) Interspeech(272) CogSci(261) (16 of 4647 venues shown)
GrowBag graphs for keyword (Num. hits/coverage)
The graphs summarize 13907 occurrences of 6353 keywords.

Results
Found 47874 publication records. Showing 47874, according to the selection in the facets.
Records (authors, title, venue, year):
Jiangtao Zhang, Qi Wang 0039, Qingshan Wang, Zhiwen Zheng: Multimodal Fusion Framework Based on Statistical Attention and Contrastive Attention for Sign Language Recognition. IEEE Trans. Mob. Comput., 2024.
Nikolaos Mylonas, Ioannis Mollas, Grigorios Tsoumakas: An attention matrix for every decision: faithfulness-based arbitration among multiple attention-based interpretations of transformers in text classification. Data Min. Knowl. Discov., 2024.
Athena Stein, Jacob R. Thorstensen, Jonathan M. Ho, Daniel P. Ashley, Kartik K. Iyer, Karen M. Barlow: Attention Please! Unravelling the Link Between Brain Network Connectivity and Cognitive Attention Following Acquired Brain Injury: A Systematic Review of Structural and Functional Measures. Brain Connect., 2024.
Jimoon Kim, Sanghwa Hong, Min-Keun Song, Kibum Kim: Visual Attention and Pulmonary VR Training System for Children With Attention Deficit Hyperactivity Disorder. IEEE Access, 2024.
Avi Deb Raha, Mrityunjoy Gain, Rameswar Debnath, Apurba Adhikary, Yu Qiao 0004, Md. Mehedi Hassan, Anupam Kumar Bairagi, Sheikh Mohammed Shariful Islam: Attention to Monkeypox: An Interpretable Monkeypox Detection Technique Using Attention Mechanism. IEEE Access, 2024.
Zhehang Lou, Suyun Luo, Xiaoci Huang, Dan Wei: Conv-Attention: A Low Computation Attention Calculation Method for Swin Transformer. Neural Process. Lett., 2024.
Ge Tang, Xinyu Wang 0003, Hengwei Zhao, Xin Hu, Guang Jin, Yanfei Zhong: Attention in Attention for Hyperspectral With High Spatial Resolution (H) Image Classification. IEEE Trans. Geosci. Remote. Sens., 2024.
Swalpa Kumar Roy, Atri Sukul, Ali Jamali, Juan Mario Haut, Pedram Ghamisi: Cross Hyperspectral and LiDAR Attention Transformer: An Extended Self-Attention for Land Use and Land Cover Classification. IEEE Trans. Geosci. Remote. Sens., 2024.
Yichen Zhao, Yaxiong Chen, Yi Rong, Shengwu Xiong, Xiaoqiang Lu: Global-Group Attention Network With Focal Attention Loss for Aerial Scene Classification. IEEE Trans. Geosci. Remote. Sens., 2024.
Yinchao Gu, Chao Ma 0004, Qian Li, Xiaokang Yang: 3D Lane Detection With Attention in Attention. IEEE Signal Process. Lett., 2024.
Zachary Smith, Michael Strobel, Bodhi P. Vani, Pratyush Tiwary: Graph Attention Site Prediction (GrASP): Identifying Druggable Binding Sites Using Graph Neural Networks with Attention. J. Chem. Inf. Model., 2024.
Qingtian Wu, Yu Zhang 0125, Liming Zhang 0002, Haoyong Yu: Parallel Self-Attention and Spatial-Attention Fusion for Human Pose Estimation and Running Movement Recognition. IEEE Trans. Cogn. Dev. Syst., 2024.
Somiya Rani, Amita Jain, Akshi Kumar 0001, Guang Yang 0006: CCheXR-Attention: Clinical concept extraction and chest x-ray reports classification using modified Mogrifier and bidirectional LSTM with multihead attention. Int. J. Imaging Syst. Technol., 2024.
Serdar Erisen: SERNet-Former: Semantic Segmentation by Efficient Residual Network with Attention-Boosting Gates and Attention-Fusion Networks. CoRR, 2024.
Saebom Leem, Hyunseok Seo: Attention Guided CAM: Visual Explanations of Vision Transformer Guided by Self-Attention. CoRR, 2024.
Ali Hassani 0001, Wen-Mei Hwu, Humphrey Shi: Faster Neighborhood Attention: Reducing the O(n^2) Cost of Self Attention at the Threadblock Level. CoRR, 2024.
Barna Saha, Christopher Ye: The I/O Complexity of Attention, or How Optimal is Flash Attention? CoRR, 2024.
Franck Michel, Fabien Gandon: Pay Attention: a Call to Regulate the Attention Market and Prevent Algorithmic Emotional Governance. CoRR, 2024.
Zihang Song, Prabodh Katti, Osvaldo Simeone, Bipin Rajendran: Stochastic Spiking Attention: Accelerating Attention with Stochastic Computing in Spiking Networks. CoRR, 2024.
Shrimon Mukherjee, Pulakesh Pramanik, Partha Basuchowdhuri, Santanu Bhattacharya: G4-Attention: Deep Learning Model with Attention for predicting DNA G-Quadruplexes. CoRR, 2024.
Tianyi Zhang, Jonah Wonkyu Yi, Bowen Yao, Zhaozhuo Xu, Anshumali Shrivastava: NoMAD-Attention: Efficient LLM Inference on CPUs Through Multiply-add-free Attention. CoRR, 2024.
Yuting Ding, Fei Chen: TAnet: A New Temporal Attention Network for EEG-based Auditory Spatial Attention Decoding with a Short Decision Window. CoRR, 2024.
R. Gnana Praveen, Jahangir Alam: Cross-Attention is Not Always Needed: Dynamic Cross-Attention for Audio-Visual Dimensional Emotion Recognition. CoRR, 2024.
Saman Khamesian, Hamed Malek: Hybrid self-attention NEAT: a novel evolutionary self-attention approach to improve the NEAT algorithm in high dimensional inputs. Evol. Syst., 2024.
Shuhan Qi, Xinhao Huang, Peixi Peng, Xuzhong Huang, Jiajia Zhang, Xuan Wang 0002: Cascaded Attention: Adaptive and Gated Graph Attention Network for Multiagent Reinforcement Learning. IEEE Trans. Neural Networks Learn. Syst., 2024.
Saebom Leem, Hyunseok Seo: Attention Guided CAM: Visual Explanations of Vision Transformer Guided by Self-Attention. AAAI, 2024.
Danilo Dordevic, Vukasin Bozic, Joseph Thommes, Daniele Coppola, Sidak Pal Singh: Rethinking Attention: Exploring Shallow Feed-Forward Neural Networks as an Alternative to Attention Layers in Transformers (Student Abstract). AAAI, 2024.
Reza Azad, Leon Niggemeier, Michael Hüttemann, Amirhossein Kazerouni, Ehsan Khodapanah Aghdam, Yury Velichko, Ulas Bagci, Dorit Merhof: Beyond Self-Attention: Deformable Large Kernel Attention for Medical Image Segmentation. WACV, 2024.
Jing Zhang, Huan Sun, Joyce C. Ho: EMBA: Entity Matching using Multi-Task Learning of BERT with Attention-over-Attention. EDBT, 2024.
Rui Fu, Tao Huang, Mingyue Li, Qinyu Sun, Yunxing Chen: A multimodal deep neural network for prediction of the driver's focus of attention based on anthropomorphic attention mechanism and prior knowledge. Expert Syst. Appl., 2023.
Wei Zhou 0042, Zhiwu Xia, Peng Dou, Tao Su, Haifeng Hu 0001: Double Attention Based on Graph Attention Network for Image Multi-Label Classification. ACM Trans. Multim. Comput. Commun. Appl., 2023.
Tianpeng Pan, Lili Zhang 0005, Yingchao Song, Yuxuan Liu: Hybrid Attention Compression Network With Light Graph Attention Module for Remote Sensing Images. IEEE Geosci. Remote. Sens. Lett., 2023.
Chenbin Liang, Baihua Xiao, Bo Cheng 0005, Yunyun Dong: XANet: An Efficient Remote Sensing Image Segmentation Model Using Element-Wise Attention Enhancement and Multi-Scale Attention Fusion. Remote. Sens., 2023.
Congping Chen, Jing Shi, Zhiwei Xu, Zhihan Wang: Attention UNet3+: a full-scale connected attention-aware UNet for CT image segmentation of liver. J. Electronic Imaging, 2023.
Zhonghua Sun 0003, Tianyi Wang, Meng Dai: Combining channel-wise joint attention and temporal attention in graph convolutional networks for skeleton-based action recognition. Signal Image Video Process., 2023.
Baoyu Cui, Sumin Qi, Jing Meng, Yingjie Jiang: Joint 2D attention gate and channel-spatial attention network for retinal vessel segmentation of OCT-angiography images. Signal Image Video Process., 2023.
Lei Ju 0005, Josef Kittler, Muhammad Awais Rana, Wankou Yang, Zhen-Hua Feng: Keep an eye on faces: Robust face detection with heatmap-Assisted spatial attention and scale-Aware layer attention. Pattern Recognit., 2023.
Wei Wu, Zhan Li, Takumi Miura, Yasuhiro Hatori, Chia-huei Tseng, Ichiro Kuriki, Yoshiyuki Sato, Satoshi Shioiri: Different Mechanisms for Visual Attention at the Hand-movement Goal and Endogenous Visual Attention. J. Cogn. Neurosci., 2023.
Yadong Shao, Aodi Wu, Shengyang Li, Leizheng Shu, Xue Wan, Yuanbin Shao, Junyan Huo: Satellite Component Semantic Segmentation: Video Dataset and Real-Time Pyramid Attention and Decoupled Attention Network. IEEE Trans. Aerosp. Electron. Syst., 2023.
Debin Wei, Pinru Li, Hongji Xie, Yongqiang Xu: DRCNet: Road Extraction From Remote Sensing Images Using DenseNet With Recurrent Criss-Cross Attention and Convolutional Block Attention Module. IEEE Access, 2023.
Runwei Guan, Ka Lok Man, Haocheng Zhao, Ruixiao Zhang, Shanliang Yao, Jeremy S. Smith, Eng Gee Lim, Yutao Yue: MAN and CAT: mix attention to nn and concatenate attention to YOLO. J. Supercomput., 2023.
Anurag Singh Tomar, K. V. Arya, Shyam Singh Rajput: Deep HyFeat Based Attention in Attention Model for Face Super-Resolution. IEEE Trans. Instrum. Meas., 2023.
Mingyi Sun, Weigang Cui, Yue Zhang, Shuyue Yu, Xiaofeng Liao 0001, Bin Hu 0001, Yang Li 0010: Attention-Rectified and Texture-Enhanced Cross-Attention Transformer Feature Fusion Network for Facial Expression Recognition. IEEE Trans. Ind. Informatics, 2023.
Zhengyi Guan, Xiaobing Zhou: A prefix and attention map discrimination fusion guided attention for biomedical named entity recognition. BMC Bioinform., 2023.
Zhenning Zhou, Xiaoxiao Zhu, Qixin Cao: AAGDN: Attention-Augmented Grasp Detection Network Based on Coordinate Attention and Effective Feature Fusion Method. IEEE Robotics Autom. Lett., 2023.
Zhihua Tao, Chunping Ouyang, Yongbin Liu, Tong Lee Chung, Yixin Cao 0002: Multi-head attention graph convolutional network model: End-to-end entity and relation joint extraction based on multi-head attention graph convolutional network. CAAI Trans. Intell. Technol., 2023.
Pierre Baldi, Roman Vershynin: The quarks of attention: Structure and capacity of neural attention building blocks. Artif. Intell., 2023.
Siqi Liu, Jiangshu Wei, Gang Liu, Bei Zhou: Image classification model based on large kernel attention mechanism and relative position self-attention mechanism. PeerJ Comput. Sci., 2023.
Shilei Yan, Yujuan Qi, Mengxue Liu, Yan-Jiang Wang 0001, Baodi Liu: Object tracking based on siamese network with 3D attention and multiple graph attention. Comput. Vis. Image Underst., 2023.
Xin Lin, Li Zhu, Shuyu Yang, Yaxiong Wang: Diff attention: A novel attention scheme for person re-identification. Comput. Vis. Image Underst., 2023.
Bo Jiang, Yao Lu, Xiaosheng Chen, Xinhai Lu, Guangming Lu: Graph Attention in Attention Network for Image Denoising. IEEE Trans. Syst. Man Cybern. Syst., 2023.
Ardiansyah Koeshidayatullah, Ivan Ferreira-Chacua, Weichang Li: Is attention all geosciences need? Advancing quantitative petrography with attention-based deep learning. Comput. Geosci., 2023.
Jingxuan Geng, Chunhua Yang 0001, Lijuan Lan, Yonggang Li, Jie Han, Can Zhou: A novel total nitrogen prediction method based on recurrent neural networks utilizing cross-coupling attention and selective attention. Neurocomputing, 2023.
K. H. Vijaya Kumari, Soubhagya Sankar Barpanda: Residual UNet with Dual Attention - An ensemble residual UNet with dual attention for multi-modal and multi-class brain MRI segmentation. Int. J. Imaging Syst. Technol., 2023.
Roop Ranjan, A. K. Daniel 0001: CoBiAt: A Sentiment Classification Model Using Hybrid ConvNet-Dual-LSTM with Attention Mechanism. Informatica (Slovenia), 2023.
Xiangbin Liu, Shufen Hou, Shuai Liu 0002, Weiping Ding 0001, Yudong Zhang 0001: Attention-based multimodal glioma segmentation with multi-attention layers for small-intensity dissimilarity. J. King Saud Univ. Comput. Inf. Sci., 2023.
Yanlei Kang, Arne Elofsson, Yunliang Jiang, Weihong Huang, Minzhe Yu, Zhong Li: AFTGAN: prediction of multi-type PPI based on attention free transformer and graph attention network. Bioinform., 2023.
Zhuoyi Lin, Sheng Zang, Rundong Wang, Zhu Sun, J. Senthilnath 0001, Chi Xu, Chee Keong Kwoh 0001: Attention Over Self-Attention: Intention-Aware Re-Ranking With Dynamic Transformer Encoders for Recommendation. IEEE Trans. Knowl. Data Eng., 2023.
Bosheng Qin, Haoji Hu, Yueting Zhuang: Deep Residual Weight-Sharing Attention Network With Low-Rank Attention for Visual Question Answering. IEEE Trans. Multim., 2023.
Bonan Kou, Shengmai Chen, Zhijie Wang, Lei Ma 0003, Tianyi Zhang 0001: Is Model Attention Aligned with Human Attention? An Empirical Study on Large Language Models for Code Generation. CoRR, 2023.
Kin Wai Lau, Lai-Man Po, Yasar Abbas Ur Rehman: Large Separable Kernel Attention: Rethinking the Large Kernel Attention Design in CNN. CoRR, 2023.
Yingyi Chen, Qinghua Tao, Francesco Tonin, Johan A. K. Suykens: Primal-Attention: Self-attention through Asymmetric Kernel SVD in Primal Representation. CoRR, 2023.
Nicolas Makaroff, Laurent D. Cohen: Chan-Vese Attention U-Net: An attention mechanism for robust segmentation. CoRR, 2023.
Shashanka Venkataramanan, Amir Ghodrati, Yuki M. Asano, Fatih Porikli, Amirhossein Habibian: Skip-Attention: Improving Vision Transformers by Paying Less Attention. CoRR, 2023.
Syed Farhan Abbas, Nguyen Thanh Duc, Yoonguu Song, Kyungwon Kim, Boreom Lee: CV-Attention UNet: Attention-based UNet for 3D Cerebrovascular Segmentation of Enhanced TOF-MRA Images. CoRR, 2023.
Mansi Sakarvadia, Arham Khan, Aswathy Ajith, Daniel Grzenda, Nathaniel Hudson, André Bauer 0001, Kyle Chard, Ian T. Foster: Attention Lens: A Tool for Mechanistically Interpreting the Attention Head Information Retrieval Mechanism. CoRR, 2023.
Reza Azad, Leon Niggemeier, Michael Huttemann, Amirhossein Kazerouni, Ehsan Khodapanah Aghdam, Yury Velichko, Ulas Bagci, Dorit Merhof: Beyond Self-Attention: Deformable Large Kernel Attention for Medical Image Segmentation. CoRR, 2023.
Elizaveta Kovtun, Galina Boeva, Artem Zabolotnyi, Evgeny Burnaev, Martin Spindler, Alexey Zaytsev 0002: Label Attention Network for sequential multi-label classification: you were looking at a wrong self-attention. CoRR, 2023.
Marcial Sanchis-Agudo, Yuning Wang, Karthik Duraisamy, Ricardo Vinuesa: Easy attention: A simple self-attention mechanism for Transformers. CoRR, 2023.
Jaewook Lee, Seongmin Heo, Jay H. Lee: Attention Mechanism for Lithium-Ion Battery Lifespan Prediction: Temporal and Cyclic Attention. CoRR, 2023.
Zeping Min: Attention Link: An Efficient Attention-Based Low Resource Machine Translation Architecture. CoRR, 2023.
Paria Mehrani, John K. Tsotsos: Self-attention in Vision Transformers Performs Perceptual Grouping, Not Attention. CoRR, 2023.
Daniel Biermann, Fabrizio Palumbo, Morten Goodwin, Ole-Christoffer Granmo: Harnessing Attention Mechanisms: Efficient Sequence Reduction using Attention-based Autoencoders. CoRR, 2023.
Dongchen Han, Tianzhu Ye, Yizeng Han, Zhuofan Xia, Shiji Song, Gao Huang: Agent Attention: On the Integration of Softmax and Linear Attention. CoRR, 2023.
Hongjun Wang, Jiyuan Chen, Lun Du, Qiang Fu, Shi Han, Xuan Song: Causal-Based Supervision of Attention in Graph Neural Network: A Better and Simpler Choice towards Powerful Attention. CoRR, 2023.
Muhan Zhang: Neural Attention: Enhancing QKV Calculation in Self-Attention Mechanism with Neural Networks. CoRR, 2023.
William Brandon, Aniruddha Nrusimha, Kevin Qian, Zachary Ankner, Tian Jin, Zhiye Song, Jonathan Ragan-Kelley: Striped Attention: Faster Ring Attention for Causal Transformers. CoRR, 2023.
Robert Huben, Valerie Morris: Attention-Only Transformers and Implementing MLPs with Attention Heads. CoRR, 2023.
Huiyin Xue, Nikolaos Aletras: Pit One Against Many: Leveraging Attention-head Embeddings for Parameter-efficient Multi-head Attention. CoRR, 2023.
Zhibo Yang 0002, Sounak Mondal, Seoyoung Ahn, Gregory J. Zelinsky, Minh Hoai, Dimitris Samaras: Predicting Human Attention using Computational Attention. CoRR, 2023.
Pablo Cesar Quihui-Rubio, Daniel Flores-Araiza, Miguel González-Mendoza 0001, Christian Mata, Gilberto Ochoa-Ruiz: FAU-Net: An Attention U-Net Extension with Feature Pyramid Attention for Prostate Cancer Segmentation. CoRR, 2023.
Ying Mo, Hongyin Tang, Jiahao Liu, Qifan Wang, Zenglin Xu, Jingang Wang, Wei Wu, Zhoujun Li 0001: Multi-task Transformer with Relation-attention and Type-attention for Named Entity Recognition. CoRR, 2023.
Heejun Lee, Jina Kim, Jeffrey Willette, Sung Ju Hwang: SEA: Sparse Linear Attention with Estimated Attention Mask. CoRR, 2023.
Mootez Saad, Tushar Sharma 0001: Naturalness of Attention: Revisiting Attention in Code Language Models. CoRR, 2023.
Matteo Pagliardini, Daniele Paliotta, Martin Jaggi, François Fleuret: Faster Causal Attention Over Large Sequences Through Sparse Flash Attention. CoRR, 2023.
Salwa K. Al Khatib, Mohamed El Amine Boudjoghra, Jameel Hassan: Drawing Attention to Detail: Pose Alignment through Self-Attention for Fine-Grained Object Classification. CoRR, 2023.
Shanshan Zhong, Wushao Wen, Jinghui Qin, Qiangpu Chen, Zhongzhan Huang: LSAS: Lightweight Sub-attention Strategy for Alleviating Attention Bias Problem. CoRR, 2023.
Huawen Feng, Zhenxi Lin, Qianli Ma 0001: Perturbation-based Self-supervised Attention for Attention Bias in Text Classification. CoRR, 2023.
Aaditya Kharel, Manas Paranjape, Aniket Bera: DF-TransFusion: Multimodal Deepfake Detection via Lip-Audio Cross-Attention and Facial Self-Attention. CoRR, 2023.
Yandan Tan, Hongbin Zhu, Jie Wu, Hongfeng Chai: Tab-Attention: Self-Attention-based Stacked Generalization for Imbalanced Credit Default Prediction. CoRR, 2023.
Linzhi Huang, Mei Wang, Jiahao Liang, Weihong Deng, Hongzhi Shi, Dongchao Wen, Yingjie Zhang, Jian Zhao: Gradient Attention Balance Network: Mitigating Face Recognition Racial Bias via Gradient Attention. CoRR, 2023.
Vukasin Bozic, Danilo Dordevic, Daniele Coppola, Joseph Thommes, Sidak Pal Singh: Rethinking Attention: Exploring Shallow Feed-Forward Neural Networks as an Alternative to Attention Layers in Transformers. CoRR, 2023.
Chi-En Amy Tai, Elizabeth Janes, Chris Czarnecki, Alexander Wong: Double-Condensing Attention Condenser: Leveraging Attention in Deep Learning to Detect Skin Cancer from Skin Lesion Images. CoRR, 2023.
Yanming Kang, Giang Tran, Hans De Sterck: Fast Multipole Attention: A Divide-and-Conquer Attention Mechanism for Long Sequences. CoRR, 2023.
Lev V. Utkin, Andrei V. Konstantinov, Stanislav R. Kirpichenko: Attention and self-attention in random forests. Prog. Artif. Intell., 2023.
Wei Tao, Chang Li 0001, Rencheng Song, Juan Cheng, Yu Liu 0023, Feng Wan, Xun Chen 0001: EEG-Based Emotion Recognition via Channel-Wise Attention and Self Attention. IEEE Trans. Affect. Comput., 2023.
Paria Mehrani, John K. Tsotsos: Self-attention in vision transformers performs perceptual grouping, not attention. Frontiers Comput. Sci., 2023.
Meng-Hao Guo, Zheng-Ning Liu, Tai-Jiang Mu, Shi-Min Hu 0001: Beyond Self-Attention: External Attention Using Two Linear Layers for Visual Tasks. IEEE Trans. Pattern Anal. Mach. Intell., 2023.
Juan Chen, Nannan Xi, Vilma Pohjonen, Juho Hamari: Paying attention in metaverse: an experiment on spatial attention allocation in extended reality shopping. Inf. Technol. People, 2023.
Yanlin Wu, Guanglei Wang, Zhongyang Wang, Hongrui Wang, Yan Li: Triplet attention fusion module: A concise and efficient channel attention module for medical image segmentation. Biomed. Signal Process. Control., 2023.
Displaying results #701-#800 of 47874 (100 per page).
Maintained by L3S.
Previously maintained by Jörg Diederich.
Based upon DBLP by Michael Ley.
Data released under the ODC-BY 1.0 license.