HF Space is broken

#30
by dumb-dev - opened


FlashAttention2 has been toggled on, but it cannot be used due to the following error: Flash Attention 2 is not available on CPU. Please make sure torch can access a CUDA device.
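A minimal sketch of a CPU-safe fallback, assuming the Space loads the model through transformers' `attn_implementation` argument; `choose_attn_implementation` is a hypothetical helper, not code from the repo:

```python
# Sketch: fall back to a CPU-compatible attention backend instead of failing
# when Flash Attention 2 is requested without a CUDA device.
# `choose_attn_implementation` is a hypothetical helper for illustration.

def choose_attn_implementation(cuda_available: bool, want_flash: bool) -> str:
    """Return an attention backend name that transformers accepts."""
    if want_flash and cuda_available:
        # Flash Attention 2 needs a CUDA GPU and the flash-attn package.
        return "flash_attention_2"
    # PyTorch's scaled-dot-product attention works on CPU.
    return "sdpa"

# In the Space this would be wired up roughly as:
#   import torch
#   from transformers import AutoModelForCausalLM
#   impl = choose_attn_implementation(torch.cuda.is_available(), want_flash=True)
#   model = AutoModelForCausalLM.from_pretrained(model_id, attn_implementation=impl)

print(choose_attn_implementation(cuda_available=False, want_flash=True))
```

With a check like this, toggling FlashAttention2 on a CPU-only Space would degrade to `sdpa` instead of raising the error above.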

Is a GPU needed for a better demo?

Without a GPU, the model will be very slow. You can run this Colab notebook to quickly test out the base model: https://github.com/NanoNets/docext/blob/main/PDF2MD_README.md#getting-started

We are working on a chat platform; you can try that out as well and share feedback: https://docstrange.nanonets.com/.
