Computational Algebra with Attention: Transformer Oracles for Border Basis Algorithms
Published in the Conference on Neural Information Processing Systems (NeurIPS), 2025
We introduce a learning-augmented approach to computational algebra, where transformer models are trained to serve as oracles for a classical symbolic algorithm. Specifically, we focus on border basis algorithms and demonstrate significant speedups over the unaugmented classical computation.
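To give a rough flavor of the learning-augmented pattern mentioned above, the sketch below shows a generic setup in which a learned oracle scores candidate polynomials before an exact symbolic routine processes them, with the exact routine retained as a fallback so correctness never depends on the oracle. This is purely illustrative: the function names (`oracle_filter`, `run_reduction_loop`), the string-based "polynomials", and the constant scoring stub are hypothetical and are not taken from the paper's actual method or interface.

```python
# Illustrative sketch of a learning-augmented filtering step (hypothetical names,
# not the paper's interface): a learned oracle scores candidate polynomials, the
# promising ones are reduced first, and the exact symbolic routine still handles
# every candidate, so the oracle only affects efficiency, not correctness.

from typing import Callable, List, Tuple


def oracle_filter(
    candidates: List[str],
    score: Callable[[str], float],
    threshold: float = 0.5,
) -> Tuple[List[str], List[str]]:
    """Split candidates into oracle-approved and deferred groups."""
    keep = [c for c in candidates if score(c) >= threshold]
    defer = [c for c in candidates if score(c) < threshold]
    return keep, defer


def run_reduction_loop(candidates: List[str], score: Callable[[str], float]) -> List[str]:
    """Process oracle-approved candidates first; the rest follow as a fallback."""
    keep, defer = oracle_filter(candidates, score)
    processed = []
    for c in keep + defer:  # the exact routine eventually sees every candidate
        processed.append(f"reduced({c})")  # placeholder for an exact symbolic reduction
    return processed


if __name__ == "__main__":
    # A trivial stub stands in for the trained transformer's score function.
    demo_score = lambda c: 0.9 if "x" in c else 0.1
    print(run_reduction_loop(["x*y - 1", "y**2", "x**2 - y"], demo_score))
```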
Recommended citation: Kera, H.*, Pelleriti, N.*, Ishihara, Y., Zimmer, M., & Pokutta, S. (2025). Computational Algebra with Attention: Transformer Oracles for Border Basis Algorithms. Conference on Neural Information Processing Systems (NeurIPS). San Diego, CA. (*equal contribution)
Download Paper
