FacetedDBLP search results


Searching for the phrase "Mixture-of-experts" (changed automatically), with no syntactic query expansion, in all metadata.

Publication years (Num. hits)
1995-1997 (18), 1998-2001 (15), 2002-2004 (23), 2005-2006 (28), 2007 (22), 2008 (17), 2009-2010 (27), 2011-2012 (24), 2013-2014 (16), 2015-2016 (23), 2017 (17), 2018 (25), 2019 (27), 2020 (33), 2021 (47), 2022 (88), 2023 (117), 2024 (64)
Publication types (Num. hits)
article (335), incollection (3), inproceedings (288), phdthesis (5)
Venues (Conferences, Journals, ...)
CoRR (215), NeurIPS (16), ICASSP (9), AAAI (8), IJCNN (8), Neurocomputing (7), INTERSPEECH (6), CVPR (5), EMNLP (5), EMNLP (Findings) (5), ICML (5), ICNN (5), ESANN (4), Expert Syst. Appl. (4), ICME (4), IEEE Access (4), more (+10 of total 251)
GrowBag graphs for keyword (Num. hits/coverage)

The graphs summarize 83 occurrences of 62 keywords.

Results
Found 631 publication records. Showing 631 according to the selection in the facets.
Authors · Title · Venue · Year
Shibal Ibrahim, Wenyu Chen, Hussein Hazimeh 0001, Natalia Ponomareva 0001, Zhe Zhao 0001, Rahul Mazumder: COMET: Learning Cardinality Constrained Mixture of Experts with Trees and Local Search. KDD 2023.
Keyao Li, Jungang Xu: AC-MMOE: A Multi-gate Mixture-of-experts Model Based on Attention and Convolution. INNS DLIA@IJCNN 2023.
Mengzhu Wang, Jianlong Yuan, Zhibin Wang: Mixture-of-Experts Learner for Single Long-Tailed Domain Generalization. ACM Multimedia 2023.
Yue Zhao 0016, Guoqing Zheng, Subhabrata Mukherjee, Robert McCann, Ahmed Awadallah 0001: ADMoE: Anomaly Detection with Mixture-of-Experts from Noisy Labels. AAAI 2023.
Xinjie Li, Huijuan Xu 0001: MEID: Mixture-of-Experts with Internal Distillation for Long-Tailed Video Recognition. AAAI 2023.
Dongjin Jeong, Jinyeong Bak: Conversational Emotion-Cause Pair Extraction with Guided Mixture of Experts. EACL 2023.
Bo Xu 0023, Shizhou Huang, Ming Du, Hongya Wang, Hui Song, Yanghua Xiao, Xin Lin: A Unified Visual Prompt Tuning Framework with Mixture-of-Experts for Multimodal Information Extraction. DASFAA (3) 2023.
Hanwen Liu, Yibing Zhang, Ximeng Wang, Bin Wang, Yanwei Yu: ST-MoE: Spatio-Temporal Mixture of Experts for Multivariate Time Series Forecasting. ISKE 2023.
Siddharth Singh, Olatunji Ruwase, Ammar Ahmad Awan, Samyam Rajbhandari, Yuxiong He, Abhinav Bhatele: A Hybrid Tensor-Expert-Data Parallelism Approach to Optimize Mixture-of-Experts Training. ICS 2023.
Rishov Sarkar, Hanxue Liang, Zhiwen Fan, Zhangyang Wang, Cong Hao: Edge-MoE: Memory-Efficient Multi-Task Vision Transformer Architecture with Task-Level Sparsity via Mixture-of-Experts. ICCAD 2023.
Shaohuai Shi, Xinglin Pan, Xiaowen Chu 0001, Bo Li 0001: PipeMoE: Accelerating Mixture-of-Experts through Adaptive Pipelining. INFOCOM 2023.
Elvira Fleig, Jonas Geistert, Erik Bochinski, Rolf Jongebloed, Thomas Sikora: Edge-Aware Autoencoder Design for Real-Time Mixture-of-Experts Image Compression. ISCAS 2023.
Jiamin Li, Qiang Su, Yitao Yang, Yimin Jiang, Cong Wang 0001, Hong Xu 0001: Adaptive Gating in Mixture-of-Experts based Language Models. EMNLP 2023.
Truong Do, Le Khiem, Quang Pham, TrungTin Nguyen, Thanh-Nam Doan, Binh Nguyen, Chenghao Liu, Savitha Ramasamy, Xiaoli Li 0001, Steven C. H. Hoi: HyperRouter: Towards Efficient Training and Inference of Sparse Mixture of Experts. EMNLP 2023.
Sheng Shen, Zhewei Yao, Chunyuan Li, Trevor Darrell, Kurt Keutzer, Yuxiong He: Scaling Vision-Language Models with Sparse Mixture of Experts. EMNLP (Findings) 2023.
Nishanth Dikkala, Nikhil Ghosh, Raghu Meka, Rina Panigrahy, Nikhil Vyas 0001, Xin Wang 0016: On the Benefits of Learning to Route in Mixture-of-Experts Models. EMNLP 2023.
Hassan Akbari, Dan Kondratyuk, Yin Cui, Rachel Hornung, Huisheng Wang, Hartwig Adam: Alternating Gradient Descent and Mixture-of-Experts for Integrated Multimodal Perception. NeurIPS 2023.
Huy Nguyen, TrungTin Nguyen, Nhat Ho: Demystifying Softmax Gating Function in Gaussian Mixture of Experts. NeurIPS 2023.
Haotao Wang, Ziyu Jiang, Yuning You, Yan Han, Gaowen Liu, Jayanth Srinivasa, Ramana Kompella, Zhangyang Wang: Graph Mixture of Experts: Learning on Large-Scale Graphs with Explicit Diversity Modeling. NeurIPS 2023.
Iago Pachêco Gomes, Cristiano Premebida, Denis Fernando Wolf: Interaction-Aware Maneuver Intention Prediction Using Gated Mixture-of-Experts Attention Mechanism. ITSC 2023.
Zhongqiu Li, Yu Hong, Shiming He, Shuai Yang, Guodong Zhou: A Mixture of Experts with Adaptive Semantic Encoding for Event Detection. IJCNN 2023.
Ziqing Xia, Justin JiXi Wen, Chun-Hsien Chen, Meng-Hsueh Hsieh, Jo-Yu Kuo: EEG-Based Stress Recognition Through the Integration of Convolutional Neural Networks and Mixture of Experts Ensemble Modelling. TE 2023.
Damai Dai, Wenbin Jiang, Jiyuan Zhang, Yajuan Lyu, Zhifang Sui, Baobao Chang: Mixture-of-Experts for Biomedical Question Answering. NLPCC (1) 2023.
Fanyi Qu, Che Wang, Yunfang Wu: Accurate, Diverse and Multiple Distractor Generation with Mixture of Experts. NLPCC (1) 2023.
Tobias Kleanthous, Antonio Martini: Learning Robust and Scalable Motion Matching with Lipschitz Continuity and Sparse Mixture of Experts. MIG 2023.
Lynnette Hui Xian Ng, Kathleen M. Carley: BotBuster: Multi-Platform Bot Detection Using a Mixture of Experts. ICWSM 2023.
Aran Komatsuzaki, Joan Puigcerver, James Lee-Thorp, Carlos Riquelme Ruiz, Basil Mustafa, Joshua Ainslie, Yi Tay, Mostafa Dehghani 0001, Neil Houlsby: Sparse Upcycling: Training Mixture-of-Experts from Dense Checkpoints. ICLR 2023.
Bo Li 0125, Yifei Shen, Jingkang Yang, Yezhen Wang, Jiawei Ren, Tong Che, Jun Zhang, Ziwei Liu 0002: Sparse Mixture-of-Experts are Domain Generalizable Learners. ICLR 2023.
Zhenxing Mi, Dan Xu 0002: Switch-NeRF: Learning Scene Decomposition with Mixture of Experts for Large-scale Neural Radiance Fields. ICLR 2023.
Mohammed Nowaz Rabbani Chowdhury, Shuai Zhang 0015, Meng Wang 0003, Sijia Liu 0001, Pin-Yu Chen: Patch-level Routing in Mixture-of-Experts is Provably Sample-efficient for Convolutional Neural Networks. ICML 2023.
Shubham Sharma 0002, Jette Henderson, Joydeep Ghosh: FEAMOE: Fair, Explainable and Adaptive Mixture of Experts. IJCAI 2023.
Benjamin Fraser, Adolfo Perrusquía, Dimitrios Panagiotakopoulos, Weisi Guo: A Deep Mixture of Experts Network for Drone Trajectory Intent Classification and Prediction using Non-Cooperative Radar Data. SSCI 2023.
Kizito Masaba, Alberto Quattrini Li: Multi-Robot Adaptive Sampling based on Mixture of Experts Approach to Modeling Non-Stationary Spatial Fields. MRS 2023.
TrungTin Nguyen, Dung Ngoc Nguyen, Hien Duy Nguyen, Faicel Chamroukhi: A Non-asymptotic Risk Bound for Model Selection in a High-Dimensional Mixture of Experts via Joint Rank and Variable Selection. AI (2) 2023.
Aytaç Özkan, Yi-Hsin Li, Thomas Sikora: Steered-Mixture-of-Experts Regression for Image Denoising with Multi-Model Inference. EUSIPCO 2023.
Ganesh Jawahar, Subhabrata Mukherjee, Xiaodong Liu, Young Jin Kim, Muhammad Abdul-Mageed, Laks V. S. Lakshmanan, Ahmed Hassan Awadallah, Sébastien Bubeck, Jianfeng Gao 0001: AutoMoE: Heterogeneous Mixture-of-Experts with Adaptive Computation for Efficient Neural Machine Translation. ACL (Findings) 2023.
Mohsen Mohammadzadeh, Ehsan Akbari, Anas A. Salameh, Mojtaba Ghadamyari, Sasan Pirouzi, Tomonobu Senjyu: Application of Mixture of Experts in Machine Learning-Based Controlling of DC-DC Power Electronics Converter. IEEE Access 2022.
Jakob Thumm, Marcel Reith-Braun, Florian Pfaff, Uwe D. Hanebeck, Merle Flitter, Georg Maier, Robin Gruna, Thomas Längle, Albert Bauer, Harald Kruggel-Emden: Mixture of Experts of Neural Networks and Kalman Filters for Optical Belt Sorting. IEEE Trans. Ind. Informatics 2022.
Zihang Li, Xiaohong Chen 0003, Jingye Li, Jian Zhang: Pertinent Multigate Mixture-of-Experts-Based Prestack Three-Parameter Seismic Inversion. IEEE Trans. Geosci. Remote. Sens. 2022.
Jialu Wu, Junmei Wang, Zhenxing Wu, Shengyu Zhang, Yafeng Deng, Yu Kang 0002, Dong-Sheng Cao 0001, Chang-Yu Hsieh, Tingjun Hou: ALipSol: An Attention-Driven Mixture-of-Experts Model for Lipophilicity and Solubility Prediction. J. Chem. Inf. Model. 2022.
Lianbo Zhang, Shaoli Huang, Wei Liu 0007: Enhancing Mixture-of-Experts by Leveraging Attention for Fine-Grained Recognition. IEEE Trans. Multim. 2022.
Francisco Souza 0001, Tim Offermans, Ruud Barendse, Geert J. Postma, Jeroen J. Jansen: Contextual Mixture of Experts: Integrating Knowledge into Predictive Modeling. CoRR 2022.
Aya Abdelsalam Ismail, Sercan Ö. Arik, Jinsung Yoon, Ankur Taly, Soheil Feizi, Tomas Pfister: Interpretable Mixture of Experts for Structured Data. CoRR 2022.
Simiao Zuo, Qingru Zhang, Chen Liang 0006, Pengcheng He, Tuo Zhao, Weizhu Chen: MoEBERT: from BERT to Mixture-of-Experts via Importance-Guided Adaptation. CoRR 2022.
Antoine Maiorca, Nathan Hubens, Sohaib Laraba, Thierry Dutoit: Towards Lightweight Neural Animation: Exploration of Neural Network Pruning in Mixture of Experts-based Animation Models. CoRR 2022.
Aran Komatsuzaki, Joan Puigcerver, James Lee-Thorp, Carlos Riquelme Ruiz, Basil Mustafa, Joshua Ainslie, Yi Tay, Mostafa Dehghani 0001, Neil Houlsby: Sparse Upcycling: Training Mixture-of-Experts from Dense Checkpoints. CoRR 2022.
Qianyu Zhou 0001, Ke-Yue Zhang, Taiping Yao, Ran Yi, Shouhong Ding, Lizhuang Ma: Adaptive Mixture of Experts Learning for Generalizable Face Anti-Spoofing. CoRR 2022.
Henok Ghebrechristos, Stence Nicholas, David Mirsky, Gita Alaghband, Manh Huynh, Zackary Kromer, Ligia Batista, Brent O'Neill, Steven Moulton, Daniel M. Lindberg: Deep Learning Mixture-of-Experts Approach for Cytotoxic Edema Assessment in Infants and Children. CoRR 2022.
Zixiang Chen, Yihe Deng, Yue Wu, Quanquan Gu, Yuanzhi Li: Towards Understanding Mixture of Experts in Deep Learning. CoRR 2022.
Ze Chen, Kangxu Wang, Zijian Cai, Jiewen Zheng, Jiarong He, Max Gao, Jason Zhang: Using Deep Mixture-of-Experts to Detect Word Meaning Shift for TempoWiC. CoRR 2022.
Zitian Chen, Yikang Shen, Mingyu Ding, Zhenfang Chen, Hengshuang Zhao, Erik G. Learned-Miller, Chuang Gan: Mod-Squad: Designing Mixture of Experts As Modular Multi-Task Learners. CoRR 2022.
Jingang Qu, Thibault Faney, Ze Wang, Patrick Gallinari, Soleiman Yousef, Jean-Charles de Hemptinne: HMOE: Hypernetwork-based Mixture of Experts for Domain Generalization. CoRR 2022.
Nikoli Dryden, Torsten Hoefler: Spatial Mixture-of-Experts. CoRR 2022.
Samyam Rajbhandari, Conglong Li, Zhewei Yao, Minjia Zhang, Reza Yazdani Aminabadi, Ammar Ahmad Awan, Jeff Rasley, Yuxiong He: DeepSpeed-MoE: Advancing Mixture-of-Experts Inference and Training to Power Next-Generation AI Scale. CoRR 2022.
Basil Mustafa, Carlos Riquelme, Joan Puigcerver, Rodolphe Jenatton, Neil Houlsby: Multimodal Contrastive Learning with LIMoE: the Language-Image Mixture of Experts. CoRR 2022.
Yue Zhao 0016, Guoqing Zheng, Subhabrata Mukherjee, Robert McCann, Ahmed Awadallah 0001: ADMoE: Anomaly Detection with Mixture-of-Experts from Noisy Labels. CoRR 2022.
Liguang Zhou, Yuhongze Zhou, Tin Lun Lam, Yangsheng Xu: Context-aware Mixture-of-Experts for Unbiased Scene Graph Generation. CoRR 2022.
Mathieu Ravaut, Shafiq R. Joty, Nancy F. Chen: SummaReranker: A Multi-Task Mixture-of-Experts Re-ranking Framework for Abstractive Summarization. CoRR 2022.
Nhat Thien Pham, Faicel Chamroukhi: Functional mixture-of-experts for classification. CoRR 2022.
Yanqi Zhou, Tao Lei, Hanxiao Liu, Nan Du 0002, Yanping Huang, Vincent Y. Zhao, Andrew M. Dai, Zhifeng Chen, Quoc Le 0001, James Laudon: Mixture-of-Experts with Expert Choice Routing. CoRR 2022.
Svetlana Pavlitskaya, Christian Hubschneider, Lukas Struppek, J. Marius Zöllner: Balancing Expert Utilization in Mixture-of-Experts Layers Embedded in CNNs. CoRR 2022.
Shashank Gupta, Subhabrata Mukherjee, Krishan Subudhi, Eduardo Gonzalez, Damien Jose, Ahmed Hassan Awadallah, Jianfeng Gao 0001: Sparsely Activated Mixture-of-Experts are Robust Multi-Task Learners. CoRR 2022.
Stefano Pini, Christian S. Perone, Aayush Ahuja, Ana Sofia Rufino Ferreira, Moritz Niendorf, Sergey Zagoruyko: Safe Real-World Autonomous Driving by Learning to Predict and Plan with a Mixture of Experts. CoRR 2022.
Tsz Chai Fung, Spark C. Tseung: Mixture of experts models for multilevel data: modelling framework and approximation theory. CoRR 2022.
Xiaohe Ma, Yaxin Yu, Hongzhi Wu, Kun Zhou 0001: Efficient Reflectance Capture with a Deep Gated Mixture-of-Experts. CoRR 2022.
Lynnette Hui Xian Ng, Kathleen M. Carley: BotBuster: Multi-platform Bot Detection Using A Mixture of Experts. CoRR 2022.
Tianyu Chen, Shaohan Huang, Yuan Xie, Binxing Jiao, Daxin Jiang, Haoyi Zhou, Jianxin Li 0002, Furu Wei: Task-Specific Expert Pruning for Sparse Mixture-of-Experts. CoRR 2022.
Shubham Sharma 0002, Jette Henderson, Joydeep Ghosh: FEAMOE: Fair, Explainable and Adaptive Mixture of Experts. CoRR 2022.
Jeongmin Lee 0001, Milos Hauskrecht: Learning to Adapt Clinical Sequences with Residual Mixture of Experts. CoRR 2022.
Xiaofei Wang, Zhuo Chen 0006, Yu Shi 0001, Jian Wu, Naoyuki Kanda, Takuya Yoshioka: Breaking trade-offs in speech separation with sparsely-gated mixture of experts. CoRR 2022.
Fabian Tschopp: Tuning of Mixture-of-Experts Mixed-Precision Neural Networks. CoRR 2022.
Zewen Chi, Li Dong 0004, Shaohan Huang, Damai Dai, Shuming Ma, Barun Patra, Saksham Singhal, Payal Bajaj, Xia Song, Furu Wei: On the Representation Collapse of Sparse Mixture of Experts. CoRR 2022.
Hanxue Liang, Zhiwen Fan, Rishov Sarkar, Ziyu Jiang, Tianlong Chen, Kai Zou, Yu Cheng 0001, Cong Hao, Zhangyang Wang: M3ViT: Mixture-of-Experts Vision Transformer for Efficient Multi-task Learning with Model-Accelerator Co-design. CoRR 2022.
Caigao Jiang, Siqiao Xue, James Zhang, Lingyue Liu, Zhibo Zhu, Hongyan Hao: Learning Large-scale Universal User Representation with Sparse Mixture of Experts. CoRR 2022.
Joan Puigcerver, Rodolphe Jenatton, Carlos Riquelme, Pranjal Awasthi, Srinadh Bhojanapalli: On the Adversarial Robustness of Mixture of Experts. CoRR 2022.
Liang Shen, Zhihua Wu, Weibao Gong, Hongxiang Hao, Yangfan Bai, HuaChao Wu, Xinxuan Wu, Haoyi Xiong, Dianhai Yu, Yanjun Ma: SE-MoE: A Scalable and Efficient Mixture-of-Experts Distributed Training and Inference System. CoRR 2022.
Ferdinand Kossmann, Zhihao Jia, Alex Aiken: Optimizing Mixture of Experts using Dynamic Recompilations. CoRR 2022.
Yanglan Ou, Ye Yuan, Xiaolei Huang 0001, Stephen T. C. Wong, John Volpi, James Z. Wang 0001, Kelvin K. Wong: Patcher: Patch Transformers with Mixture of Experts for Precise Medical Image Segmentation. CoRR 2022.
Lemeng Wu, Mengchen Liu, Yinpeng Chen, Dongdong Chen 0001, Xiyang Dai, Lu Yuan: Residual Mixture of Experts. CoRR 2022.
Damai Dai, Wenbin Jiang, Jiyuan Zhang, Weihua Peng, Yajuan Lyu, Zhifang Sui, Baobao Chang, Yong Zhu 0004: Mixture of Experts for Biomedical Question Answering. CoRR 2022.
Ze-Feng Gao, Peiyu Liu 0002, Wayne Xin Zhao, Zhong-Yi Lu, Ji-Rong Wen: Parameter-Efficient Mixture-of-Experts Architecture for Pre-trained Language Models. CoRR 2022.
Trevor Gale, Deepak Narayanan, Cliff Young, Matei Zaharia: MegaBlocks: Efficient Sparse Training with Mixture-of-Experts. CoRR 2022.
Changho Hwang, Wei Cui, Yifan Xiong 0001, Ziyue Yang, Ze Liu, Han Hu 0001, Zilong Wang, Rafael Salas, Jithin Jose, Prabhat Ram, Joe Chau, Peng Cheng 0005, Fan Yang 0024, Mao Yang, Yongqiang Xiong: Tutel: Adaptive Mixture-of-Experts at Scale. CoRR 2022.
Qinyuan Ye, Juan Zha, Xiang Ren 0001: Eliciting Transferability in Multi-task Learning with Task-level Mixture-of-Experts. CoRR 2022.
Yu Qing Zhou, Xixuan Julie Liu, Yuanzhe Dong: Build a Robust QA System with Transformer-based Mixture of Experts. CoRR 2022.
Xiaobo Xia, Wenhao Yang, Jie Ren, Yewen Li, Yibing Zhan, Bo Han 0003, Tongliang Liu: Pluralistic Image Completion with Probabilistic Mixture-of-Experts. CoRR 2022.
Damai Dai, Li Dong 0004, Shuming Ma, Bo Zheng, Zhifang Sui, Baobao Chang, Furu Wei: StableMoE: Stable Routing Strategy for Mixture of Experts. CoRR 2022.
Chaoyang He 0001, Shuai Zheng 0004, Aston Zhang, George Karypis, Trishul Chilimbi, Mahdi Soltanolkotabi, Salman Avestimehr: SMILE: Scaling Mixture-of-Experts with Efficient Bi-level Routing. CoRR 2022.
Bo Li 0125, Jingkang Yang, Jiawei Ren, Yezhen Wang, Ziwei Liu 0002: Sparse Fusion Mixture-of-Experts are Domain Generalizable Learners. CoRR 2022.
Jannik Wolff, Tassilo Klein, Moin Nabi, Rahul G. Krishnan, Shinichi Nakajima: Mixture-of-experts VAEs can disregard variation in surjective multimodal data. CoRR 2022.
Tao Zhong, Zhixiang Chi, Li Gu, Yang Wang 0003, Yuanhao Yu, Jin Tang: Meta-DMoE: Adapting to Domain Shift by Meta-Distillation from Mixture-of-Experts. CoRR 2022.
Yuxuan Zhou 0002, Xien Liu, Kaiyin Zhou, Ji Wu 0002: Table-based Fact Verification with Self-adaptive Mixture of Experts. CoRR 2022.
Zhijunyi Yang, Mengjie Du, Rongfeng Su, Xiaokang Liu, Nan Yan, Lan Wang: A Phone-Level Speaker Embedding Extraction Framework with Multi-Gate Mixture-of-Experts Based Multi-Task Learning. ISCSLP 2022.
Rajarshi Chattopadhyay, Chen-Khong Tham: Mixture of Experts based Model Integration for Traffic State Prediction. VTC Spring 2022.
Zhao You, Shulin Feng, Dan Su 0002, Dong Yu 0001: Speechmoe2: Mixture-of-Experts Model with Improved Routing. ICASSP 2022.
Simiao Zuo, Qingru Zhang, Chen Liang 0006, Pengcheng He, Tuo Zhao, Weizhu Chen: MoEBERT: from BERT to Mixture-of-Experts via Importance-Guided Adaptation. NAACL-HLT 2022.
Qianyu Zhou 0001, Ke-Yue Zhang, Taiping Yao, Ran Yi, Shouhong Ding, Lizhuang Ma: Adaptive Mixture of Experts Learning for Generalizable Face Anti-Spoofing. ACM Multimedia 2022.
Jeongmin Lee 0001, Milos Hauskrecht: Learning to Adapt Dynamic Clinical Event Sequences with Residual Mixture of Experts. AIME 2022.
Chengzhe Yang: FMP-Net: Fractal Multi-Gate Mixture-of-Experts Panoramic Segmentation for Point Cloud. ICME 2022.
Patrick Petersen, Thomas Rudolf, Eric Sax: A Data-driven Energy Estimation based on the Mixture of Experts Method for Battery Electric Vehicles. VEHITS 2022.
Displaying results #201-#300 of 631 (100 per page).
Maintained by L3S.
Previously maintained by Jörg Diederich.
Based upon DBLP by Michael Ley.
Open data: released under the ODC-BY 1.0 license.