Baidu: ERNIE 4.5 21B A3B
BaiduID: baidu/ernie-4.5-21b-a3b
A text-only Mixture-of-Experts (MoE) model with 21B total parameters, of which 3B are activated per token. It is the text variant of the ERNIE 4.5 family, which is trained with heterogeneous MoE structures, modality-isolated routing, and auxiliary routing and balancing losses; the multimodal capabilities belong to other variants in the family. The model supports a 131K-token context length and achieves efficient inference through multi-expert parallel collaboration and quantization. Post-training with SFT, DPO, and UPO further tunes performance across diverse applications.
Pricing per 1M Tokens
| Item | Price |
| --- | --- |
| Input (Prompt) | $0.07 |
| Output (Completion) | $0.28 |
| Cache Read | Free |
| Cache Write | Free |
| Image | N/A |
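As a quick sanity check on the rates above, here is a minimal sketch of per-request cost arithmetic. The function name and example token counts are illustrative, not part of any official SDK; it assumes only the listed per-1M-token prices, with cache reads and writes being free.

```python
# Hypothetical cost estimator for ERNIE 4.5 21B A3B at the listed rates.
# Prices are USD per 1M tokens; cache read/write incur no charge.
INPUT_PER_M = 0.07    # Input (Prompt)
OUTPUT_PER_M = 0.28   # Output (Completion)

def estimate_cost(prompt_tokens: int, completion_tokens: int) -> float:
    """Return the estimated USD cost for a single request."""
    return (prompt_tokens * INPUT_PER_M
            + completion_tokens * OUTPUT_PER_M) / 1_000_000

# Example: a 10K-token prompt producing a 1K-token completion.
print(f"${estimate_cost(10_000, 1_000):.6f}")  # → $0.000980
```

Note the asymmetry: output tokens cost 4x input tokens, so long completions dominate the bill.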
Specifications
| Specification | Value |
| --- | --- |
| Context Length | 120K |
| Max Output Tokens | 8K |
| Input Modalities | Text |
| Output Modalities | Text |
| Tokenizer | Other |
| Instruct Type | N/A |
| Top Provider Context | 120K |
| Top Provider Max Output | 8K |
| Moderated | No |
Last updated: March 23, 2026
First tracked: March 23, 2026