Add SDPA fallback for Siglip2Navit attention

#5
by Isotr0py - opened

Add an SDPA fallback for the ViT attention so inference can run on machines without flash-attn
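A minimal sketch of the fallback pattern this PR describes: probe for flash-attn at import time and route to PyTorch's built-in `scaled_dot_product_attention` when it is unavailable. The function name `vit_attention` and the tensor layout are illustrative assumptions, not the actual Siglip2Navit code.

```python
import torch
import torch.nn.functional as F

# Probe for flash-attn once at import time; fall back to SDPA when absent.
try:
    from flash_attn import flash_attn_func
    HAS_FLASH_ATTN = True
except ImportError:
    HAS_FLASH_ATTN = False


def vit_attention(q: torch.Tensor, k: torch.Tensor, v: torch.Tensor) -> torch.Tensor:
    """Attention over (batch, heads, seq, head_dim) tensors.

    Illustrative sketch: the real model's layout and masking may differ.
    """
    if HAS_FLASH_ATTN and q.is_cuda:
        # flash_attn_func expects (batch, seq, heads, head_dim)
        out = flash_attn_func(q.transpose(1, 2), k.transpose(1, 2), v.transpose(1, 2))
        return out.transpose(1, 2)
    # SDPA path: available in stock PyTorch >= 2.0, works on CPU and GPU.
    return F.scaled_dot_product_attention(q, k, v)


q = k = v = torch.randn(2, 8, 16, 64)
out = vit_attention(q, k, v)
```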

Isotr0py changed pull request title from sdpa-fallback to Add SDPA fallback for Siglip2Navit attention
Isotr0py changed pull request status to open

It seems this supports only single-sample inference, not batched inference? Thanks for your work!

xxyyy123 changed pull request status to merged
AIDC-AI org

@Isotr0py Thank you for adding SDPA support🚀, which facilitates running inference in environments without flash-attn.
