HEBATRON: A Hebrew-Specialized Open-Weight Mixture-of-Experts Language Model
arXiv cs.CL · Noam Kayzer, Dan Revital, Ori Bar Joseph, Smadar Arvatz, Or Levi, Tal Geva, Shaltiel Shmidman, Amir DN Cohen, Noam Ordan, Omer Baruch, Kate Zinkovskaia, Zevi Apini, Sarel Weinberger
May 13, 2026
Quick Take
Hebatron is a Hebrew-specialized, open-weight Mixture-of-Experts language model that reaches 73.8% average accuracy on Hebrew reasoning benchmarks.
Key Points
Built on NVIDIA Nemotron-3 architecture.
Achieves 73.8% average Hebrew reasoning accuracy.
First open-weight Hebrew MoE model with long-context support.
Hebatron's strong results on Hebrew reasoning benchmarks mark a significant advance for Hebrew-language models, opening new opportunities for developers, product managers, and investors building specialized AI applications for Hebrew-speaking markets.
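The summary describes Hebatron as a Mixture-of-Experts model but gives no implementation details. As a rough illustration of the general MoE idea, the sketch below shows top-k expert routing in a feed-forward layer using PyTorch; the layer sizes, expert count, and top-k value are arbitrary placeholders and this is not Hebatron's or Nemotron's actual implementation.

```python
# Minimal, generic sketch of top-k expert routing in a Mixture-of-Experts
# feed-forward layer. Illustrative only: not Hebatron's actual architecture,
# and all sizes (d_model, d_ff, num_experts, top_k) are placeholder values.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TopKMoE(nn.Module):
    def __init__(self, d_model=512, d_ff=2048, num_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        # Router produces one logit per expert for each token.
        self.router = nn.Linear(d_model, num_experts)
        # Each expert is an independent feed-forward block.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        )

    def forward(self, x):  # x: (batch, seq, d_model)
        logits = self.router(x)                         # (batch, seq, num_experts)
        weights, indices = logits.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)            # normalize over the chosen experts
        out = torch.zeros_like(x)
        # Route each token only through its top-k experts and mix their outputs.
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = indices[..., slot] == e          # tokens assigned to expert e in this slot
                if mask.any():
                    out[mask] += weights[..., slot][mask].unsqueeze(-1) * expert(x[mask])
        return out


if __name__ == "__main__":
    layer = TopKMoE()
    tokens = torch.randn(2, 16, 512)
    print(layer(tokens).shape)  # torch.Size([2, 16, 512])
```

The appeal of this design for a specialized model is that only the top-k experts run per token, so parameter count can grow well beyond the per-token compute cost.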