
P100 - oh my, so Xformers for Flash Attention I think does support it, but Triton requires Compute Capability 7.0+, whilst a P100 is 6.0 :(

So technically the code can run, but I'll have to edit it to remove the Triton changes.
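For anyone wanting to do the same, a minimal sketch of the kind of guard I mean - assuming you gate on torch.cuda.get_device_capability(); the structure below is illustrative, not the actual library code:

    import torch

    # A P100 reports (6, 0) (Pascal); Triton needs (7, 0)+ (Volta and newer),
    # so gate the Triton code paths on the device's compute capability.
    major, minor = torch.cuda.get_device_capability()
    use_triton = (major, minor) >= (7, 0)

    if use_triton:
        # Safe to import and run the Triton-based kernels here
        print("Using Triton kernels")
    else:
        # Pascal-era GPUs like the P100: stick to plain PyTorch ops
        print("Falling back to plain PyTorch paths")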



Thank you! I generally don't see as much support for them and need to use old versions from back when it was supported. This trend is happening more and more; it's unfortunate, but it seems like we need a newer GPU now.


Yeah, unfortunately :( I tried my best to support all GPUs from 2018 onwards - e.g. sadly Flash Attention v2 is Ampere+ only, whilst Xformers nicely supports Tesla T4+
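Roughly the tiering that implies, as a sketch - the cutoffs are just the ones above (Flash Attention v2 on Ampere, SM 8.0+; Xformers down to the Tesla T4's SM 7.5), and the function name is hypothetical:

    import torch

    def pick_attention_backend() -> str:
        # Assumed cutoffs, per the comment above:
        #   Flash Attention v2: Ampere and newer (SM 8.0+)
        #   Xformers memory-efficient attention: Tesla T4 and newer (SM 7.5+)
        #   Anything older (e.g. P100, SM 6.0): plain PyTorch attention
        major, minor = torch.cuda.get_device_capability()
        if (major, minor) >= (8, 0):
            return "flash_attention_v2"
        if (major, minor) >= (7, 5):
            return "xformers"
        return "torch_native"

    print(pick_attention_backend())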



