MoE
Type: Technology. Mixture-of-Experts (MoE) architecture; the related articles mention 35B-A3B MoE and 122B-A10B MoE variants.
262 mentions · 120 connections · Last seen: 2026-04-29
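Naming such as "35B-A3B" conventionally means roughly 35B total parameters with only about 3B active per token, which is the defining property of a sparsely gated MoE layer: a learned router sends each token to a small subset of experts, so compute per token tracks the active parameters rather than the total. Below is a minimal sketch of top-k expert routing; the sizes, variable names, and routing details are illustrative assumptions and are not taken from any of the models listed on this page.

```python
# Minimal sketch (not from the source) of top-k expert routing, the idea behind
# "35B-A3B"-style naming: many total parameters, but each token only activates
# the parameters of its top_k chosen experts.
import numpy as np

rng = np.random.default_rng(0)

d_model, d_ff = 64, 256        # toy sizes; real MoE models are far larger
n_experts, top_k = 8, 2        # each token activates only top_k of n_experts

# One feed-forward "expert" = two weight matrices (illustrative shapes).
experts = [
    (rng.standard_normal((d_model, d_ff)) * 0.02,
     rng.standard_normal((d_ff, d_model)) * 0.02)
    for _ in range(n_experts)
]
router = rng.standard_normal((d_model, n_experts)) * 0.02  # gating projection

def moe_layer(x: np.ndarray) -> np.ndarray:
    """Route each token to its top_k experts and mix their outputs."""
    logits = x @ router                              # (tokens, n_experts)
    top = np.argsort(logits, axis=-1)[:, -top_k:]    # chosen expert ids per token
    out = np.zeros_like(x)
    for t, token in enumerate(x):
        chosen = logits[t, top[t]]
        weights = np.exp(chosen - chosen.max())
        weights /= weights.sum()                     # softmax over chosen experts only
        for w, e in zip(weights, top[t]):
            w1, w2 = experts[e]
            out[t] += w * (np.maximum(token @ w1, 0.0) @ w2)  # ReLU FFN expert
    return out

tokens = rng.standard_normal((4, d_model))
print(moe_layer(tokens).shape)  # (4, 64): dense-sized output, sparse compute
```

Since only top_k of n_experts run per token, total parameter count can grow with the number of experts while per-token FLOPs stay roughly constant, which is why an "A3B" model can be served far more cheaply than a dense model of the same total size.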
Relationship graph
Relationships (135)
Uses this technology (93)
Qwen3, DeepSeek V3, Kimi Moonshot, GLM-5, Qwen3.5-Omni, 深度求索 DeepSeek, Flash Attention, Qwen3.5-397B, 阿里巴巴, Qoder, 美团, Agnes, Claw, LongCat-Next, KAT-Coder-Pro V2, GLM-5.1, HippoGenius, ProMoE, Gemma 4, Gemma 4 26B, Step 3.5 Flash, Gemma 4 26B MoE, 腾讯, MiMo大模型, GO-1, UniMMAD, DeepSeek R1, JoyAI-LLM Flash, Claude Mythos, AgentMoE-32B, MiniMax-01, LLaMA 4, Claude Opus 4.6, HEX, DeepSeek V2, M2.7, ERNIE-4.5-VL-28B-A3B, LatentUM, 小红书, Relax, HY-World 2.0, Suiren-Base, Qwen3.6-35B-A3B, Mega MoE, GPT-5, DeepSeek V4, NVIDIA, Transformer, MoDA, OpenMythos, Claude Sonnet 4.6, Fun-ASR1.5, Attention Mechanism, Together AI, Sage, Ling-2.6-flash, 百灵 Ling-2.6-flash, STCast, Qwen3.6-27B, Gemini 2.0, Seed3D 2.0, MiMo-V2.5-Pro, GPT Image 2, TPU 8i, Hy3 preview, 混元3 Preview, 混元 Hy3 Preview, Qwen3.5-397B-A17B, GPT-4o, CoInteract, HMoE, TileKernels, 混元Hy3, 混元, OmniHuman-1, DeepSeek V4-Pro, DeepSeek-V4-Flash, Claude 3.5 Sonnet, LongCat-Flash, MatRIS-MoE, Kimi K2.6, Qwen3.6-Max-Preview, Agnes-1.5-Pro, Qwen2.5, WorldScape, LongCat-2.0-Preview, MiMo-V2.5, SenseNova U1, MiMo-V2.5, 元神AI, U1 Lite, Nemotron 3 Nano Omni, MotuBrain
Applied to (29)
Competes with (2)
Cooperates with (1)
Related articles (262)