Embedding-Aware Quantum-Classical SVMs for Scalable Quantum Machine Learning
Abstract
Pairing Vision Transformer embeddings with a quantum-classical pipeline yields quantum-kernel accuracy gains in classification tasks, demonstrating the importance of embedding choice in quantum machine learning.
Quantum Support Vector Machines face scalability challenges due to high-dimensional quantum states and hardware limitations. We propose an embedding-aware quantum-classical pipeline combining class-balanced k-means distillation with pretrained Vision Transformer embeddings. Our key finding: ViT embeddings uniquely enable quantum advantage, achieving up to 8.02% accuracy improvements over classical SVMs on Fashion-MNIST and 4.42% on MNIST, while CNN features show performance degradation. Using 16-qubit tensor network simulation via cuTensorNet, we provide the first systematic evidence that quantum kernel advantage depends critically on embedding choice, revealing fundamental synergy between transformer attention and quantum feature spaces. This provides a practical pathway for scalable quantum machine learning that leverages modern neural architectures.
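The pipeline described in the abstract can be summarized in a short sketch. This is a minimal illustration, not the authors' implementation: the precomputed ViT embeddings `X_vit`, the per-class coreset size, the product-state angle encoding, and the dense-statevector fidelity kernel are assumptions standing in for the paper's actual feature map and its cuTensorNet tensor-network contraction.

```python
# Minimal sketch of an embedding-aware quantum-kernel SVM pipeline.
# Assumptions (not from the paper): `X_vit` holds precomputed ViT features;
# the encoding, coreset size, and kernel below are illustrative stand-ins.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA
from sklearn.preprocessing import MinMaxScaler
from sklearn.svm import SVC

N_QUBITS = 16  # matches the paper's 16-qubit simulation

def distill_class_balanced(X, y, per_class=50, seed=0):
    """Class-balanced k-means distillation: for each class, keep the real
    sample closest to each of `per_class` k-means centroids."""
    keep = []
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        km = KMeans(n_clusters=min(per_class, len(idx)), n_init=10,
                    random_state=seed).fit(X[idx])
        # distance of every class sample to every centroid -> nearest sample
        d = np.linalg.norm(X[idx][:, None] - km.cluster_centers_[None], axis=2)
        keep.extend(idx[np.argmin(d, axis=0)])
    return np.unique(keep)

def angle_encode(x):
    """Product-state angle encoding: one qubit per feature (real amplitudes)."""
    state = np.array([1.0])
    for theta in x:
        state = np.kron(state, np.array([np.cos(theta / 2), np.sin(theta / 2)]))
    return state

def quantum_kernel(A, B):
    """Fidelity kernel K_ij = |<phi(a_i)|phi(b_j)>|^2 via dense statevectors
    (a stand-in for the paper's cuTensorNet tensor-network contraction)."""
    SA = np.array([angle_encode(a) for a in A])
    SB = np.array([angle_encode(b) for b in B])
    return np.abs(SA @ SB.T) ** 2

# Usage, assuming ViT embeddings X_vit and labels y are available:
# reduce to one feature per qubit, scale angles to [0, pi], distill, train.
# Z = PCA(n_components=N_QUBITS).fit_transform(X_vit)
# Z = MinMaxScaler(feature_range=(0.0, np.pi)).fit_transform(Z)
# keep = distill_class_balanced(Z, y)
# K = quantum_kernel(Z[keep], Z[keep])
# clf = SVC(kernel="precomputed").fit(K, y[keep])
```

Note that a product-state encoding yields a classically tractable kernel; the sketch only shows the pipeline shape (distill, reduce to one feature per qubit, fidelity kernel, precomputed-kernel SVM). An entangling 16-qubit feature map contracted with cuTensorNet, as in the paper, would slot in where `quantum_kernel` sits.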
Community
Embedding-aware quantum pipelines achieve a quantum ML advantage, with transformer features proving key to outperforming classical models.
This is an automated message from the Librarian Bot. The following similar papers were recommended by the Semantic Scholar API:
- QMoE: A Quantum Mixture of Experts Framework for Scalable Quantum Neural Networks (2025)
- Hardware-Aware Quantum Kernel Design Based on Graph Neural Networks (2025)
- Quantum Adaptive Excitation Network with Variational Quantum Circuits for Channel Attention (2025)
- Genetic Transformer-Assisted Quantum Neural Networks for Optimal Circuit Design (2025)
- Unitary Scrambling and Collapse: A Quantum Diffusion Framework for Generative Modeling (2025)
- Devanagari Digit Recognition using Quantum Machine Learning (2025)
- A Resource Efficient Quantum Kernel (2025)