Searching for the phrase "Mixture-of-experts" (changed automatically), with no syntactic query expansion, in all metadata.

Publication years (Num. hits)
1995-1997 (18) 1998-2001 (15) 2002-2004 (23) 2005-2006 (28) 2007 (22) 2008 (17) 2009-2010 (27) 2011-2012 (24) 2013-2014 (16) 2015-2016 (23) 2017 (17) 2018 (25) 2019 (27) 2020 (33) 2021 (47) 2022 (88) 2023 (117) 2024 (64)
Publication types (Num. hits)
article (335), incollection (3), inproceedings (288), phdthesis (5)
Venues (Conferences, Journals, ...)
CoRR (215), NeurIPS (16), ICASSP (9), AAAI (8), IJCNN (8), Neurocomputing (7), INTERSPEECH (6), CVPR (5), EMNLP (5), EMNLP (Findings) (5), ICML (5), ICNN (5), ESANN (4), Expert Syst. Appl. (4), ICME (4), IEEE Access (4), and further venues (251 in total)
GrowBag keyword graphs (not shown here) summarize 83 occurrences of 62 keywords.

Results
Found 631 publication records matching the facet selection.
Bin Lin, Zhenyu Tang, Yang Ye, Jiaxi Cui, Bin Zhu, Peng Jin, Junwu Zhang, Munan Ning, Li Yuan 0001: MoE-LLaVA: Mixture of Experts for Large Vision-Language Models. CoRR, 2024.
Zinan Zeng, Sen Ye, Zijian Cai, Heng Wang, Yuhan Liu, Haokai Zhang, Minnan Luo: MMoE: Robust Spoiler Detection with Multi-modal Information and Domain-aware Mixture-of-Experts. CoRR, 2024.
Anke Tang, Li Shen, Yong Luo, Nan Yin, Lefei Zhang, Dacheng Tao: Merging Multi-Task Models via Weight-Ensembling Mixture of Experts. CoRR, 2024.
Wenxuan Song, Han Zhao, Pengxiang Ding, Can Cui, Shangke Lyu, Yaning Fan, Donglin Wang: GeRM: A Generalist Robotic Model with Mixture-of-experts for Quadruped Robot. CoRR, 2024.
Felix Helfenstein, Jannis Blüml, Johannes Czech, Kristian Kersting: Checkmating One, by Using Many: Combining Mixture of Experts with MCTS to Improve in Chess. CoRR, 2024.
Aytaç Özkan, Elena Stoykova, Thomas Sikora, Violeta Madjarova: Denoising OCT Images Using Steered Mixture of Experts with Multi-Model Inference. CoRR, 2024.
James Oldfield 0001, Markos Georgopoulos, Grigorios G. Chrysos, Christos Tzelepis, Yannis Panagakis, Mihalis A. Nicolaou, Jiankang Deng, Ioannis Patras: Multilinear Mixture of Experts: Scalable Expert Specialization through Factorization. CoRR, 2024.
Yuejiang Liu, Alexandre Alahi: Co-Supervised Learning: Improving Weak-to-Strong Generalization with Hierarchical Mixture of Experts. CoRR, 2024.
Shanghaoran Quan: DMoERM: Recipes of Mixture-of-Experts for Effective Reward Modeling. CoRR, 2024.
Maciej Pióro, Kamil Ciebiera, Krystian Król, Jan Ludziejewski, Sebastian Jaszczur: MoE-Mamba: Efficient Selective State Space Models with Mixture of Experts. CoRR, 2024.
Haoyuan Wu, Haisheng Zheng, Bei Yu 0001: Parameter-Efficient Sparsity Crafting from Dense to Mixture-of-Experts for Instruction Tuning on General Tasks. CoRR, 2024.
Pranav Kasela, Gabriella Pasi, Raffaele Perego 0001, Nicola Tonellotto: DESIRE-ME: Domain-Enhanced Supervised Information REtrieval using Mixture-of-Experts. CoRR, 2024.
Harry Dong, Beidi Chen, Yuejie Chi: Prompt-prompted Mixture of Experts for Efficient LLM Generation. CoRR, 2024.
Jinze Zhao, Peihao Wang, Zhangyang Wang: Generalization Error Analysis for Sparse Mixture-of-Experts: A Preliminary Study. CoRR, 2024.
Nithin Chalapathi, Yiheng Du, Aditi S. Krishnapriyan: Scaling physics-informed hard constraints with mixture-of-experts. CoRR, 2024.
Renteng Yuan, Mohamed A. Abdel-Aty, Qiaojun Xiang, Zijin Wang, Xin Gu: A Temporal Multi-Gate Mixture-of-Experts Approach for Vehicle Trajectory and Driving Intention Prediction. IEEE Trans. Intell. Veh., 2024.
Rongyu Zhang, Yulin Luo, Jiaming Liu, Huanrui Yang, Zhen Dong, Denis A. Gudovskiy, Tomoyuki Okuno, Yohei Nakata, Kurt Keutzer, Yuan Du, Shanghang Zhang: Efficient Deweather Mixture-of-Experts with Uncertainty-Aware Feature-Wise Linear Modulation. AAAI, 2024.
Vedang Asgaonkar, Aditya Jain, Abir De: Generator Assisted Mixture of Experts for Feature Acquisition in Batch. AAAI, 2024.
Xuanhua He, Keyu Yan, Rui Li, Chengjun Xie, Jie Zhang 0033, Man Zhou: Frequency-Adaptive Pan-Sharpening with Mixture of Experts. AAAI, 2024.
Gwangtae Park, Seokchan Song, Haoyang Sang, Dongseok Im, Donghyeon Han, Sangyeob Kim, Hongseok Lee, Hoi-Jun Yoo: 20.8 Space-Mate: A 303.5mW Real-Time Sparse Mixture-of-Experts-Based NeRF-SLAM Processor for Mobile Spatial Computing. ISSCC, 2024.
Pranav Kasela, Gabriella Pasi, Raffaele Perego 0001, Nicola Tonellotto: DESIRE-ME: Domain-Enhanced Supervised Information Retrieval Using Mixture-of-Experts. ECIR (2), 2024.
Shaohuai Shi, Xinglin Pan, Qiang Wang 0022, Chengjian Liu, Xiaozhe Ren, Zhongzhe Hu, Yu Yang, Bo Li 0001, Xiaowen Chu: ScheMoE: An Extensible Mixture-of-Experts Distributed Training System with Tasks Scheduling. EuroSys, 2024.
Huy Nguyen, TrungTin Nguyen, Khai Nguyen, Nhat Ho: Towards Convergence Rates for Parameter Estimation in Gaussian-gated Mixture of Experts. AISTATS, 2024.
Huy Nguyen, Khai Nguyen, Nhat Ho: On Parameter Estimation in Deviated Gaussian Mixture of Experts. AISTATS, 2024.
Parfait Munezero, Mattias Villani, Robert Kohn: Dynamic Mixture of Experts Models for Online Prediction. Technometrics, 2023.
Tianfang Zhang, Rasmus Bokrantz, Jimmy Olsson: A similarity-based Bayesian mixture-of-experts model. Stat. Comput., 2023.
Elham Abbasi, Mohammad Reza Hooshmandasl: Semi-explicit mixture of experts based on information table. J. Ambient Intell. Humaniz. Comput., 2023.
Zhichao Yan, Xuefeng Su, Qinghua Chai, Xiaoqi Han, Yunxiao Zhao, Ru Li 0001: Multiple POS Dependency-Aware Mixture of Experts for Frame Identification. IEEE Access, 2023.
Yang Zhang, Radomír Gono, Michal Jasiński: An Improvement in Dynamic Behavior of Single Phase PM Brushless DC Motor Using Deep Neural Network and Mixture of Experts. IEEE Access, 2023.
Ala Shaabana, Zahra Gharaee, Paul W. Fieguth: Video Relationship Detection Using Mixture of Experts. IEEE Access, 2023.
Jinyu Xu, Qing Xie 0002, Jiachen Li 0002, Yanchun Ma, Yuhan Liu: Mixture of Experts Residual Learning for Hamming Hashing. Neural Process. Lett., 2023.
Yuxin Huang, Eric Hao Wang, Zhaoran Liu, Licheng Pan, Haozhe Li, Xinggao Liu: Modeling Task Relationships in Multivariate Soft Sensor With Balanced Mixture-of-Experts. IEEE Trans. Ind. Informatics, 2023.
Francisco Souza 0001, Tim Offermans, Ruud Barendse, Geert J. Postma, Jeroen J. Jansen: Contextual Mixture of Experts: Integrating Knowledge into Predictive Modeling. IEEE Trans. Ind. Informatics, 2023.
Guangran Cheng, Lu Dong 0002, Wenzhe Cai, Changyin Sun: Multi-Task Reinforcement Learning With Attention-Based Mixture of Experts. IEEE Robotics Autom. Lett., 2023.
Lia Morra, Andrea Biondo, Nicola Poerio, Fabrizio Lamberti: MIXO: Mixture Of Experts-Based Visual Odometry for Multicamera Autonomous Systems. IEEE Trans. Consumer Electron., 2023.
Shibal Ibrahim, Wenyu Chen, Hussein Hazimeh 0001, Natalia Ponomareva 0001, Zhe Zhao 0001, Rahul Mazumder: COMET: Learning Cardinality Constrained Mixture of Experts with Trees and Local Search. CoRR, 2023.
Yinqiong Cai, Yixing Fan, Keping Bi, Jiafeng Guo, Wei Chen 0034, Ruqing Zhang 0001, Xueqi Cheng: CAME: Competitively Learning a Mixture-of-Experts Model for First-stage Retrieval. CoRR, 2023.
Yuxin Huang, Hao Wang, Zhaoran Liu, Licheng Pan, Haozhe Li, Xinggao Liu: Modeling Task Relationships in Multi-variate Soft Sensor with Balanced Mixture-of-Experts. CoRR, 2023.
Yamuna Krishnamurthy, Chris Watkins, Thomas Gärtner 0001: Improving Expert Specialization in Mixture of Experts. CoRR, 2023.
Feng Luo, Jinxi Xiang, Jun Zhang 0018, Xiao Han 0011, Wei Yang: Image Super-resolution Via Latent Diffusion: A Sampling-space Mixture Of Experts And Frequency-augmented Decoder Approach. CoRR, 2023.
Giang Do, Khiem Le, Quang Pham, TrungTin Nguyen, Thanh-Nam Doan, Binh T. Nguyen, Chenghao Liu, Savitha Ramasamy, Xiaoli Li 0001, Steven C. H. Hoi: HyperRouter: Towards Efficient Training and Inference of Sparse Mixture of Experts. CoRR, 2023.
Jiamin Li, Qiang Su, Yitao Yang, Yimin Jiang, Cong Wang 0001, Hong Xu 0001: Adaptive Gating in Mixture-of-Experts based Language Models. CoRR, 2023.
Vedang Asgaonkar, Aditya Jain, Abir De: Generator Assisted Mixture of Experts For Feature Acquisition in Batch. CoRR, 2023.
Boan Liu, Liang Ding 0006, Li Shen 0008, Keqin Peng, Yu Cao, Dazhao Cheng, Dacheng Tao: Diversifying the Mixture-of-Experts Representation for Language Models with Orthogonal Optimizer. CoRR, 2023.
Jianchen Zhao, Cheng-Ching Tseng, Ming Lu, Ruichuan An, Xiaobao Wei, He Sun, Shanghang Zhang: MoEC: Mixture of Experts Implicit Neural Compression. CoRR, 2023.
Ted Zadouri, Ahmet Üstün, Arash Ahmadian, Beyza Ermis, Acyr Locatelli, Sara Hooker: Pushing Mixture of Experts to the Limit: Extremely Parameter Efficient MoE for Instruction Tuning. CoRR, 2023.
Artyom Eliseev, Denis Mazur: Fast Inference of Mixture-of-Experts Language Models with Offloading. CoRR, 2023.
Zhixu Du, Shiyu Li, Yuhao Wu, Xiangyu Jiang, Jingwei Sun 0002, Qilin Zheng, Yongkai Wu, Ang Li 0005, Hai (Helen) Li, Yiran Chen 0001: SiDA: Sparsity-Inspired Data-Aware Serving for Efficient and Scalable Large Mixture-of-Experts Models. CoRR, 2023.
Hassan Akbari, Dan Kondratyuk, Yin Cui, Rachel Hornung, Huisheng Wang, Hartwig Adam: Alternating Gradient Descent and Mixture-of-Experts for Integrated Multimodal Perception. CoRR, 2023.
Lucas Luttner: Training of Neural Networks with Uncertain Data, A Mixture of Experts Approach. CoRR, 2023.
Juan Gong, Zhenlin Chen, Chaoyi Ma, Zhuojian Xiao, Haonan Wang, Guoyu Tang, Lin Liu, Sulong Xu, Bo Long, Yunjiang Jiang: Attention Weighted Mixture of Experts with Contrastive Learning for Personalized Ranking in E-commerce. CoRR, 2023.
Wenxuan Wang, Guodong Ma, Yuke Li, Binbin Du: Language-Routing Mixture of Experts for Multilingual and Code-Switching Speech Recognition. CoRR, 2023.
Róbert Csordás, Piotr Piekos, Kazuki Irie, Jürgen Schmidhuber: SwitchHead: Accelerating Transformers with Mixture-of-Experts Attention. CoRR, 2023.
Haotao Wang, Ziyu Jiang, Yan Han, Zhangyang Wang: Graph Mixture of Experts: Learning on Large-Scale Graphs with Explicit Diversity Modeling. CoRR, 2023.
Huy Nguyen, Pedram Akbarian, TrungTin Nguyen, Nhat Ho: A General Theory for Softmax Gating Multinomial Logistic Mixture of Experts. CoRR, 2023.
Huy Nguyen, Pedram Akbarian, Fanqi Yan, Nhat Ho: Statistical Perspective of Top-K Sparse Softmax Gating Mixture of Experts. CoRR, 2023.
Chen Dun, Mirian Hipolito Garcia, Guoqing Zheng, Ahmed Hassan Awadallah, Robert Sim, Anastasios Kyrillidis, Dimitrios Dimitriadis: Fed-ZERO: Efficient Zero-shot Personalization with Federated Mixture of Experts. CoRR, 2023.
Rishov Sarkar, Hanxue Liang, Zhiwen Fan, Zhangyang Wang, Cong Hao: Edge-MoE: Memory-Efficient Multi-Task Vision Transformer Architecture with Task-level Sparsity via Mixture-of-Experts. CoRR, 2023.
Yiming Sun, Bing Cao, Pengfei Zhu 0001, Qinghua Hu: MoE-Fusion: Instance Embedded Mixture-of-Experts for Infrared and Visible Image Fusion. CoRR, 2023.
Jiaxin Gao, Qinglong Cao, Yuntian Chen: MoE-AMC: Enhancing Automatic Modulation Classification Performance Using Mixture-of-Experts. CoRR, 2023.
Rongsheng Wang, Haoming Chen, Ruizhe Zhou, Yaofei Duan, Kunyan Cai, Han Ma, Jiaxi Cui, Jian Li, Patrick Cheong-Iao Pang, Yapeng Wang, Tao Tan: Aurora: Activating Chinese chat capability for Mixtral-8x7B sparse Mixture-of-Experts through Instruction-Tuning. CoRR, 2023.
Sheng Shen, Zhewei Yao, Chunyuan Li, Trevor Darrell, Kurt Keutzer, Yuxiong He: Scaling Vision-Language Models with Sparse Mixture of Experts. CoRR, 2023.
Zhenbang Du, Jiayu An, Jiahao Hong, Dongrui Wu: Mixture-of-Experts for Open Set Domain Adaptation: A Dual-Space Detection Approach. CoRR, 2023.
Isidora Chara Tourni, Subhajit Naskar: Direct Neural Machine Translation with Task-level Mixture of Experts models. CoRR, 2023.
Do Huu Dat, Po Yuan Mao, Tien Hoang Nguyen, Wray L. Buntine, Mohammed Bennamoun: HOMOE: A Memory-Based and Composition-Aware Framework for Zero-Shot Learning with Hopfield Network and Soft Mixture of Experts. CoRR, 2023.
Erik Daxberger, Floris Weers, Bowen Zhang, Tom Gunter, Ruoming Pang, Marcin Eichner, Michael Emmersberger, Yinfei Yang, Alexander Toshev, Xianzhi Du: Mobile V-MoEs: Scaling Down Vision Transformers via Sparse Mixture-of-Experts. CoRR, 2023.
Hanrong Ye, Dan Xu: TaskExpert: Dynamically Assembling Multi-Task Representations with Memorial Mixture-of-Experts. CoRR, 2023.
Rongyu Zhang, Yulin Luo, Jiaming Liu, Huanrui Yang, Zhen Dong, Denis A. Gudovskiy, Tomoyuki Okuno, Yohei Nakata, Kurt Keutzer, Yuan Du, Shanghang Zhang: Efficient Deweather Mixture-of-Experts with Uncertainty-aware Feature-wise Linear Modulation. CoRR, 2023.
Huy Nguyen, TrungTin Nguyen, Khai Nguyen, Nhat Ho: Towards Convergence Rates for Parameter Estimation in Gaussian-gated Mixture of Experts. CoRR, 2023.
Yuchang Jiang, Vivien Sainte Fare Garnot, Konrad Schindler, Jan Dirk Wegner: Mixture of Experts with Uncertainty Voting for Imbalanced Deep Regression Problems. CoRR, 2023.
Peikun Chen, Fan Yu, Yuhao Liang, Hongfei Xue, Xucheng Wan, Naijun Zheng, Huan Zhou 0004, Lei Xie 0001: BA-MoE: Boundary-Aware Mixture-of-Experts Adapter for Code-Switching Speech Recognition. CoRR, 2023.
Sheng Shen, Le Hou, Yanqi Zhou, Nan Du 0002, Shayne Longpre, Jason Wei, Hyung Won Chung, Barret Zoph, William Fedus, Xinyun Chen, Tu Vu, Yuexin Wu, Wuyang Chen, Albert Webson, Yunxuan Li, Vincent Y. Zhao, Hongkun Yu 0001, Kurt Keutzer, Trevor Darrell, Denny Zhou: Flan-MoE: Scaling Instruction-Finetuned Language Models with Sparse Mixture of Experts. CoRR, 2023.
Siddharth Singh, Olatunji Ruwase, Ammar Ahmad Awan, Samyam Rajbhandari, Yuxiong He, Abhinav Bhatele: A Novel Tensor-Expert Hybrid Parallelism Approach to Scale Mixture-of-Experts Training. CoRR, 2023.
Zihan Qiu, Zeyu Huang, Jie Fu: Emergent Mixture-of-Experts: Can Dense Pre-trained Transformers Benefit from Emergent Modular Structures? CoRR, 2023.
Renteng Yuan, Mohamed A. Abdel-Aty, Qiaojun Xiang, Zijin Wang, Ou Zheng: A Novel Temporal Multi-Gate Mixture-of-Experts Approach for Vehicle Trajectory and Driving Intention Prediction. CoRR, 2023.
Mohammed Nowaz Rabbani Chowdhury, Shuai Zhang 0015, Meng Wang 0003, Sijia Liu 0001, Pin-Yu Chen: Patch-level Routing in Mixture-of-Experts is Provably Sample-efficient for Convolutional Neural Networks. CoRR, 2023.
Yekun Chai, Qiyue Yin, Junge Zhang: Improved Training of Mixture-of-Experts Language GANs. CoRR, 2023.
Shihan Dou, Enyu Zhou, Yan Liu, Songyang Gao, Jun Zhao 0019, Wei Shen, Yuhao Zhou, Zhiheng Xi, Xiao Wang, Xiaoran Fan, Shiliang Pu, Jiang Zhu, Rui Zheng, Tao Gui, Qi Zhang 0001, Xuanjing Huang 0001: LoRAMoE: Revolutionizing Mixture of Experts for Maintaining World Knowledge in Language Model Alignment. CoRR, 2023.
Huy Nguyen, TrungTin Nguyen, Nhat Ho: Demystifying Softmax Gating in Gaussian Mixture of Experts. CoRR, 2023.
Aditya Shankar Mishra, Santosh Singh Rathore: Implicit and explicit mixture of experts models for software defect prediction. Softw. Qual. J., 2023.
Jianying Chen, Yun Xue, Haolan Zhang 0001, Weiping Ding 0001, Zhengxuan Zhang, Jiehai Chen: On development of multimodal named entity recognition using part-of-speech and mixture of experts. Int. J. Mach. Learn. Cybern., 2023.
Nathan C. Hurley, Sanket S. Dhruva, Nihar Desai, Joseph R. Ross, Che Ngufor, Frederick Masoudi, Harlan M. Krumholz, Bobak J. Mortazavi: Clinical Phenotyping with an Outcomes-driven Mixture of Experts for Patient Matching and Risk Estimation. ACM Trans. Comput. Heal., 2023.
Evan Kawamura, Dilmurat Azimov, John S. Allen, Corey Ippolito: Hierarchical mixture of experts for autonomous unmanned aerial vehicles utilizing thrust models and acoustics. Robotics Auton. Syst., 2023.
Hanrong Ye, Dan Xu: TaskExpert: Dynamically Assembling Multi-Task Representations with Memorial Mixture-of-Experts. ICCV, 2023.
Tianlong Chen, Xuxi Chen, Xianzhi Du, Abdullah Rashwan, Fan Yang, Huizhong Chen, Zhangyang Wang, Yeqing Li: AdaMV-MoE: Adaptive Multi-Task Vision Mixture-of-Experts. ICCV, 2023.
Shuqing Bian, Xingyu Pan, Wayne Xin Zhao, Jinpeng Wang 0001, Chuyuan Wang, Ji-Rong Wen: Multi-modal Mixture of Experts Representation Learning for Sequential Recommendation. CIKM, 2023.
Shuhao Li, Yue Cui 0001, Yan Zhao 0008, Weidong Yang, Ruiyuan Zhang, Xiaofang Zhou 0001: ST-MoE: Spatio-Temporal Mixture-of-Experts for Debiasing in Traffic Prediction. CIKM, 2023.
Stefano Pini, Christian S. Perone, Aayush Ahuja, Ana Sofia Rufino Ferreira, Moritz Niendorf, Sergey Zagoruyko: Safe Real-World Autonomous Driving by Learning to Predict and Plan with a Mixture of Experts. ICRA, 2023.
Juan Gong, Zhenlin Chen, Chaoyi Ma, Zhuojian Xiao, Haonan Wang, Guoyu Tang, Lin Liu, Sulong Xu, Bo Long, Yunjiang Jiang: Attention Weighted Mixture of Experts with Contrastive Learning for Personalized Ranking in E-commerce. ICDE, 2023.
Gal Blecher, Shai Fine: MoEAtt: A Deep Mixture of Experts Model using Attention-based Routing Gate. ICMLA, 2023.
Henok Ghebrechristos, Stence Nicholas, David Mirsky, Manh Huynh, Zackary Kromer, Ligia Batista, Gita Alaghband, Brent O'Neill, Steven Moulton, Daniel M. Lindberg: Deep Learning Mixture-of-Experts for Cytotoxic Edema Assessment in Infants and Children. ISBI, 2023.
Joe Germino, Nuno Moniz, Nitesh V. Chawla: Fairness-Aware Mixture of Experts with Interpretability Budgets. DS, 2023.
Peikun Chen, Fan Yu, Yuhao Liang, Hongfei Xue, Xucheng Wan, Naijun Zheng, Huan Zhou 0004, Lei Xie 0001: BA-MoE: Boundary-Aware Mixture-of-Experts Adapter for Code-Switching Speech Recognition. ASRU, 2023.
Zheyuan Liu 0010, Chunhui Zhang, Yijun Tian 0001, Erchi Zhang, Chao Huang 0001, Yanfang Ye 0001, Chuxu Zhang: Fair Graph Representation Learning via Diverse Mixture-of-Experts. WWW, 2023.
Paulo E. Linares Otoya, Shinfeng D. Lin: A Mixture-of-Experts (MoE) Framework for Pose-Invariant Face Recognition via Local Landmark-Centered Feature Extraction. TAAI (2), 2023.
Shuyang Li, Enrico Magli, Gianluca Francini: To be Conservative or to be Aggressive? A Risk-Adaptive Mixture of Experts for Mobile Traffic Forecasting. ICC, 2023.
Yi-Hsin Li, Mårten Sjöström, Sebastian Knorr, Thomas Sikora: Segmentation-based Initialization for Steered Mixture of Experts. VCIP, 2023.
Yekun Chai, Qiyue Yin, Junge Zhang: Improved Training Of Mixture-Of-Experts Language GANs. ICASSP, 2023.
Jian Xu, Yang Lei, Guangqi Zhu, Yunling Feng, Bo Xiao, Qifeng Qian, Yajing Xu: SL-MoE: A Two-Stage Mixture-of-Experts Sequence Learning Framework for Forecasting Rapid Intensification of Tropical Cyclone. ICASSP, 2023.
Juncai Liu, Jessie Hui Wang, Yimin Jiang: Janus: A Unified Distributed Training Framework for Sparse Mixture-of-Experts Models. SIGCOMM, 2023.
Displaying results #101-#200 of 631 (100 per page).
Maintained by L3S.
Previously maintained by Jörg Diederich.
Based upon DBLP by Michael Ley.
Data released under the ODC-BY 1.0 license.